AN IMPLEMENTATION OF THE HMD-ENABLED INTERFACE AND SYSTEM USABILITY TEST
SANGHOON SHIN, SEOKKYOO KIM, JUNO CHANG
Game Design and Development, Sangmyung University, 20 Hongjimoon-2-gil Seoul, 110-743, Republic of Korea
KYUNGEUN PARK
Computer and Information Sciences, 8000 York Road, Towson, MD 21252, U.S.A.
ABSTRACT: This research studies existing head-mounted display (HMD)-enabled interface programs and suggests an improved interface scheme for HMD-enabled virtual reality (VR) games. Traditional keyboard- and mouse-based input schemes need to be replaced by user-friendly interface schemes such as the HMD. The higher the degree of immersion expected by game users, the less user intervention is desired for capturing user input and presenting output. For this reason, a user-centric VR environment has been built in which users' motions, such as turning the head and raising the hands, are recognized as meaningful input to the system by the well-known contact-type Hydra and non-contact-type Kinect systems. In particular, three Kinects are arranged to improve the motion recognition rate. The proposed HMD interface not only aims to increase the degree of mutual interaction between game players and VR games but also enhances the usability of the HMD. Diverse user gestures are recognized in combination with the newly suggested interfaces. The improved interfaces have been integrated with two representative testbed games belonging to the action and sports genres. The action game was developed and interfaced with a non-wearable (non-contact) interface, whereas the sports game was integrated with both contact and non-contact interfaces. Afterwards, the effectiveness and usability of the interfaces were assessed according to an empirical evaluation scheme with modified System Usability Scale (SUS) statements. The SUS test results show that the proposed interface enhanced the system usability as well as the system interactivity in terms of user satisfaction.
KEYWORDS: Virtual Reality (VR), Cyberspace, Head-Mounted Display (HMD), System Usability Scale (SUS).
INTRODUCTION
Virtual reality (VR) refers to the overall theoretical basis and applied technology for providing users with the indirect experience of a cyberspace through multi-sensing mechanisms.
In other words, it makes people feel and act in a computer-generated cyberspace as if they were in the real world, by allowing a variety of input/output devices to be used to promote mutual interaction between users and computers. Brooks defines a virtual reality experience as any in which the user is effectively immersed in a responsive virtual world, Brooks (1999). Traditional keyboard- and mouse-based input schemes need to be replaced by user-friendly interface schemes that minimize constraints both in capturing user input and in presenting output to users, because the desktop environment restricts users' active space and limits the sources of input for interaction to wired contact interfaces. To maximize the degree of immersion, many new interface schemes, including contact and non-contact interfaces, have been developed.
1st International Symposium on Simulation & Serious Games 2014 (ISSSG 2014)
Copyright © 2014 ISSSG Organisers. All rights reserved.
ISBN: 978-981-09-0463-0
doi: 10.3850/978-981-09-0463-0_041
In the 1960s, Sutherland introduced the Sketchpad system, which presented a new method of man-machine communication that made it possible for a man and a computer to converse rapidly through the medium of line drawings, Sutherland (1963). Thereafter, Ivan Sutherland pioneered the first graphics-driven head-mounted display (HMD), presenting the user with a perspective image that changes as he moves in order to create the illusion of seeing three-dimensional images, Sutherland (1968), Rolland & Hua (2005), based on the perspective transformation algorithm and a wide variety of equipment. Up to the present, much effort has been put into developing highly immersive interfaces to enhance the recognition rate of motion-controlled video games combined with various motion controllers. Recent technology applied to these motion controllers makes it possible to distinguish a variety of user motions, such as turning the head or raising the hands. Thus, this research focuses on improving the quality of the HMD-enabled interface scheme of VR games. The proposed interface scheme adopts multiple units of the non-contact motion sensing interface Kinect, developed by Microsoft, to increase the motion recognition rate. The interface has been integrated with two testbed games belonging to the action and sports genres. After a comparative analysis, the well-known contact-type Hydra and non-contact-type Kinect systems were selected as the experimental interface schemes. The action genre game was developed and interfaced with the non-contact interface, Kinect, whereas the sports genre game was integrated with both the contact interface, Hydra, and the non-contact interface, Kinect. Afterwards, the interactivity and usability of the interfaces were assessed according to an empirical evaluation scheme with modified System Usability Scale (SUS) statements.
The SUS test results show that the proposed interface enhanced the system usability as well as the system applicability. This paper is composed of 5 chapters in total. Chapters 1 and 2 present the introduction and related research. Chapter 3 explains the interface improvements, including descriptions of the testbed games and user feedback. Chapter 4 discusses the analysis of the SUS test results according to the interface improvements. Chapter 5 concludes with future research directions.
RELATED RESEARCH
An interface for a computer game allows users to communicate with the game, helps both sides understand each other, and works as an interpreter between them. As an advanced three-dimensional interface, going beyond two-dimensional interfaces such as the mouse, keyboard, and touchpad, Bolt (1980) proposed a media-room environment where voice and gesture inputs at the graphics interface converge to provide a concerted, natural user modality. Generally, motion sensing interfaces are divided into contact and non-contact interfaces. Contact interfaces require sensor devices to be attached to the user and work depending on the captured motion information, whereas non-contact interfaces are based on user gestures tracked by multiple cameras, Bolt (1980).
Contact Interfaces
Contact interfaces are supported by various motion controllers, including the WiiMote, PSMove, Hydra, and Data Glove. The WiiMote (see Figure 1), the game controller of Nintendo's Wii, recognizes user motion in three-dimensional space based on an acceleration sensor and a tilt sensor. As a result, it distinguishes different motions, from aiming to swinging. Figure 2 shows the PlayStation Move motion
controller presented by Sony. It is regarded as superior to the WiiMote in accuracy. Another variation of the stick-style interface, called Hydra (see Figure 3), allows users to interact with a PC through a wired connection and is more accurate than the above wireless interface schemes. In addition to stick-style interfaces, wearable data gloves equipped with many sensors are widely used, with comprehensive motions of the hand and fingers recognized as control information (see Figure 4).
Figure 1. WiiMote
Figure 2. PSMove
Figure 3. Hydra Controller
Figure 4. Data Glove
The following table compares the previously introduced contact interfaces in terms of available platform, commercialization status, user preference, advantage, and disadvantage.

Table 1. Contact Interfaces

Interface  | Platform | Comm. Status | Preference | Advantage              | Disadvantage
Data Glove | PC       | No           | Low        | Various gestures       | Inconvenience of wearing
Hydra      | PC       | Yes          | High       | High recognition rate  | Wired
WiiMote    | Wii      | Yes          | High       | High availability      | Not available on PC
PSMove     | PS3      | Yes          | High       | Interaction using light| Not available on PC
A PC platform, a high recognition rate, popularity, and user friendliness are considered the most important factors in selecting the experimental contact interface for this research. According to the survey results, the Hydra interface was chosen as the experimental contact interface.
Non-Contact Interfaces
Non-contact gestural interfaces interact with the game system by collecting motion information from multiple cameras. Gestural interfaces that control computers with the user's body were selected as one of the 10 emerging technologies of 2011 by MIT Technology Review, MIT (2011). They include Kinect, iPoint3D, and LeapMotion. The most popular non-contact motion sensing interface, Kinect, developed by Microsoft, is a webcam-style add-on device through which a user can interact with a console or computer, e.g., the Xbox 360, without contact-type controllers, using gestures and spoken commands.
Figure 5. Kinect
Fraunhofer developed iPoint3D, which is equipped with stereo cameras. It allows people to communicate with a 3-D display through simple gestures, without touching it and without 3-D glasses or a data glove, and also supports communication between users and systems by recognizing finger gestures as soon as a user moves their hands. The LeapMotion controller senses how users move their hands and lets them control their computers. Table 2 compares these three non-contact interfaces in terms of available platform, commercialization, preference, advantage, and disadvantage. The requirements of a PC platform, a wide recognition range, and various full-body gestures led us to select Kinect as the non-contact interface for this research.

Table 2. Non-Contact Interfaces
Interface  | Platform      | Comm. Status | Preference | Advantage             | Disadvantage
iPoint3D   | PC            | No           | Low        | Various gestures      | Low recognition rate
LeapMotion | Mac           | Yes          | High       | High recognition rate | Narrow recognition range
Kinect     | PC, Xbox, Mac | Yes          | High       | Full-body recognition | Unable to recognize a mounted device
The following two figures show how a user interacts with the games via the Hydra and Kinect interfaces (see Figure 6 and Figure 7).
Figure 6. Game Program Test with Hydra
Figure 7. Game Program Test with Kinect
IMPLEMENTATION OF INTERFACES
In this research, we first developed testbed games in which the proposed interface schemes were integrated and assessed based on customized System Usability Scale (SUS) statements, Brooke (1996), Bangor et al. (2008).
Implementation of Testbed Games
With the advent of the Wii and Kinect, many sports genre games were released and favoured by the public, and action genre games were rolled out soon after. Accordingly, both types of testbed games were implemented and combined with the proposed HMD-enabled interface programs.
Figure 8. Input/Output Interfaces
As it is important to maximize the degree of immersion in the games, any constraints in capturing user input as well as in presenting output to users should be minimized. The input schemes of traditional games are keyboard- and mouse-based. This environment not only causes spatial constraints by restricting users' active space to a desktop environment but also limits the sources of input for interaction to wired contact interfaces. As a solution, a cyberspace environment was built in which users' motions, such as turning the head and raising the hands, serve as input through the previously discussed Hydra and Kinect interfaces (see Figure 8). As the sports genre game, a racing game was implemented using Hydra and Kinect. While playing, users are provided with three-dimensional live imagery via the HMD, with a separate image stream shown on each of the two displays. Based on the principle of binocular overlap, this scheme lets the two displays present different images and thereby creates a 3D effect (see Figure 9). In this program, user motions such as jumping and hand gestures are recognized through the Kinect and Hydra systems.
Figure 9. Race (Sports) Game
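The binocular-overlap scheme above can be sketched as follows. This is a minimal illustration, not the authors' code: the same scene is rendered twice from two cameras separated horizontally by an interpupillary distance (IPD), one image per HMD display; the `IPD` value and the `eye_positions` helper are assumptions for illustration.

```python
# Sketch of binocular-overlap stereo: two camera positions offset along the
# head's right axis by half the interpupillary distance (IPD) each.
IPD = 0.064  # typical interpupillary distance in metres (assumed value)

def eye_positions(head_pos, right_axis, ipd=IPD):
    """Return (left_eye, right_eye) camera positions for stereo rendering.

    head_pos: (x, y, z) position of the head.
    right_axis: unit vector pointing to the viewer's right.
    """
    hx, hy, hz = head_pos
    rx, ry, rz = right_axis
    half = ipd / 2.0
    left = (hx - rx * half, hy - ry * half, hz - rz * half)
    right = (hx + rx * half, hy + ry * half, hz + rz * half)
    return left, right

# Head at 1.7 m height, looking down -z with +x to the right:
left, right = eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
print(left, right)  # left eye shifted to -x, right eye to +x
```

Rendering the scene once from each returned position, with each image routed to its own display, yields the differing left/right views the text describes.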
The following figure introduces the action genre game based on the Kinect system (see Figure 10).
Figure 10. Action Game
User Feedback
In the previous section, both the action and sports testbed games were introduced. Each is integrated with the Kinect and Hydra interfaces and was reviewed by fifty users. The test group consists of ten game developers, twenty game-major students, and twenty ordinary participants. First, their feedback on the initial versions of the games was collected in terms of advantages and disadvantages. Overall, the users commented on the unclear UI displayed in the HMD and on inconvenient motion due to the wired interface schemes. In the case of the action game in particular,
the opinions associated with the Kinect interface included a low degree of immersion caused by the uni-directional recognition of users' gestures, and the difficulty of simultaneous manipulation with gestures. The sports genre game also suffered from an unsatisfactory immersion level caused by Kinect's uni-directional recognition, and from the irrelevance of motions to the game due to insufficiently classified gestures. The following table summarizes the users' feedback on the implemented games according to genre and interface.

Table 3. User Feedback
Game   | Interface | Feedback
Common | All       | Unclear UI displayed in the HMD
Common | All       | Restricted motion due to wired interfaces
Action | Kinect    | Low degree of immersion caused by uni-directional recognition of users' gestures
Action | Kinect    | Difficulty of simultaneous manipulation with gestures
Sports | Kinect    | Unsatisfactory immersion level caused by uni-directional recognition
Sports | Hydra     | Irrelevance of motions to the game due to insufficiently classified gestures
Interface Improvements
The main objective of this research is to improve the HMD-enabled interface and to increase the recognition rate of users' motions. The following table summarizes the improvements to the interfaces and their effects, especially the increased interactivity between users and the game program.
Table 4. Interface Improvements
Game   | Interface | Improvement                                       | Effect
Common | HMD       | UI fixed within the display of the HMD            | -
Common | HMD       | Transponder fixed on the ceiling or floor of the room | Reduced inconvenience due to the wired connection
Action | Kinect    | Three Kinects                                     | Recognition from all directions
Action | Kinect    | Simplified available gestures                     | Rapid response
Sports | Kinect    | Three Kinects                                     | Recognition from all directions
Sports | Hydra     | Controller attached to the body                   | Recognition of various gestures
To improve the degree of immersion, the input interface of the game was upgraded with three Kinect interfaces, as shown in Figure 11. Each Kinect covers 120 degrees, so the three together cover all directions. The HMD repeater was attached either to the player's body or to the ceiling. In addition, the relevance of motions captured through Hydra was increased by attaching one end of the Hydra controller to the player's body, while the other end captures the user's movement. In this way, the player's location information is obtained.
Figure 11. Improved Kinect Interface
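The paper does not give the algorithm for combining the three Kinects, so the following is only one plausible sketch of the arrangement above: each sensor covers a 120-degree sector, and the skeleton reported with the highest tracking confidence is used for the current frame. The `pick_skeleton` helper and the confidence values are assumptions for illustration.

```python
# Hedged sketch of multi-Kinect skeleton selection: among the sensors that
# currently track the player, use the reading with the highest confidence.
def pick_skeleton(readings):
    """readings: list of (sensor_id, confidence, skeleton) tuples, one per
    Kinect that currently tracks the player; confidence in [0, 1].
    Returns the most confidently tracked skeleton, or None if no sensor sees
    the player."""
    if not readings:
        return None
    _sensor_id, _confidence, skeleton = max(readings, key=lambda r: r[1])
    return skeleton

# Example frame: the front sensor has the clearest view of the player.
frames = [
    ("kinect_front", 0.9, {"head": (0.0, 1.7, 2.0)}),
    ("kinect_left",  0.4, {"head": (0.0, 1.7, 2.0)}),
    ("kinect_right", 0.3, {"head": (0.0, 1.7, 2.0)}),
]
print(pick_skeleton(frames))  # skeleton from kinect_front
```

More elaborate fusion (e.g. averaging joint positions across sensors) is possible; selecting a single best view avoids having to calibrate the three sensors into one coordinate frame per joint.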
Motions such as jumping and swinging are used to diversify players' gestures. Players' attacks and evasions, as well as movement, are recognized as various gestures in combination with the heights of the individual joints registered in the system (see Figure 12). The button-based input functions of Hydra were replaced with player gestures by disabling all the buttons.
(a) Movement
(b) Attack I
(c) Attack II
Figure 12. Joints Classification and Related Gestures
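The joint-height classification described above can be sketched as follows. The thresholds and gesture names here are illustrative assumptions, not the authors' registered values: movement, attacks, and jumps are distinguished by comparing hand and foot heights against the head and the floor.

```python
# Illustrative sketch of gesture classification from registered joint heights.
def classify_gesture(joints):
    """joints: dict mapping joint name to height above the floor in metres.
    Returns one of "jump", "attack_2", "attack_1", "move"."""
    head = joints["head"]
    rhand = joints["right_hand"]
    lhand = joints["left_hand"]
    feet = min(joints["right_foot"], joints["left_foot"])

    if feet > 0.15:                       # both feet off the floor: evasion
        return "jump"
    if rhand > head and lhand > head:     # both hands above the head
        return "attack_2"
    if rhand > head or lhand > head:      # one hand above the head
        return "attack_1"
    return "move"                         # hands low: ordinary movement

pose = {"head": 1.7, "right_hand": 1.9, "left_hand": 1.0,
        "right_foot": 0.05, "left_foot": 0.04}
print(classify_gesture(pose))  # attack_1
```

In practice, each threshold would be tuned per player during the registration step the text mentions, since joint heights vary with body size.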
IMPROVED USABILITY THROUGH THE SUS TEST
In his study on the System Usability Scale (SUS), Brooke stated that usability is not a quality that exists in any real or absolute sense, Brooke (1996). The SUS was originally developed as a "quick and dirty" survey scale that would allow the usability practitioner to quickly and easily assess the usability of a given product or service, Bangor et al. (2008). The usability of the proposed interface was tested according to an empirical evaluation scheme with modified SUS statements presented by Bangor et al. (2008), based on the original SUS statements, Brooke (1996) (see Table 5).
Table 5. Applied SUS Statements
No. SUS Statement
1 I think that I would like to use this product frequently
2 I found the product unnecessarily complex
3 I thought the product was easy to use
4 I think that I would need the support of a technical person to be able to use this product
5 I found that the various functions in this product were well integrated
6 I thought that there was too much inconsistency in this product
7 I would imagine that most people would learn to use this product very quickly
8 I found the product very awkward to use
9 I felt very confident using the product
10 I needed to learn a lot of things before I could get going with this product
The test group consists of ten game industry employees, twenty game-major students, and twenty ordinary participants. Among them, the ten game industry employees and twenty game-major students were given two different programs, one written with the initial interface scheme and one with the revised scheme, while the remaining twenty ordinary participants tested only the revised one. Afterwards, the modified SUS statements were given to all the users. Users chose one of five levels: strongly agree, agree, neutral, disagree, and strongly disagree. The scores for the individual SUS statements in Table 5 are computed with the following score mapping table, yielding totals ranging from 0 up to 40 points (see Table 6).
Table 6. SUS Score Mapping

Statements                | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree
Positive (1, 3, 5, 7, 9)  | 4              | 3     | 2       | 1        | 0
Negative (2, 4, 6, 8, 10) | 0              | 1     | 2       | 3        | 4
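The mapping in Table 6 can be expressed as a short scoring routine. This is an assumed helper, not from the paper: positive statements score 4 down to 0 from "strongly agree" to "strongly disagree", negative statements score 0 up to 4, giving a raw total of 0 to 40 points.

```python
# Sketch of the SUS score mapping in Table 6 (assumed helper function).
SCALE = ["strongly agree", "agree", "neutral", "disagree", "strongly disagree"]

def sus_raw_score(responses):
    """responses: list of 10 answers (strings from SCALE), in statement order 1..10.
    Returns the raw SUS total in the 0-40 range of Table 6."""
    if len(responses) != 10:
        raise ValueError("SUS requires answers to all 10 statements")
    total = 0
    for i, answer in enumerate(responses, start=1):
        level = SCALE.index(answer)  # 0 = strongly agree .. 4 = strongly disagree
        if i % 2 == 1:               # positive statements 1, 3, 5, 7, 9
            total += 4 - level
        else:                        # negative statements 2, 4, 6, 8, 10
            total += level
    return total

# A respondent who strongly agrees with every positive statement and strongly
# disagrees with every negative one scores the maximum:
best = ["strongly agree", "strongly disagree"] * 5
print(sus_raw_score(best))  # 40
```

Note that the conventional SUS presentation multiplies this raw total by 2.5 to put scores on a 0-100 scale; this paper reports grades against the raw mapping above.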
The following sections describe the usability test results by comparing the SUS scores obtained with the initial interface scheme against those obtained with the revised one.
SUS Test for the Initial Version
The overall average SUS grade for the initial interface is 47, and the detailed distribution results are shown on the left side of Figure 13. Over 50% of users answered that the program is awkward to use, and over 70% of users assessed that they would need to learn a lot of things to use the program.
SUS Test after Improvement
The overall average SUS grade for the improved interface is 53, and the detailed distribution results are shown on the right-hand side of Figure 13. Over 50% of users responded that they would like to use the program frequently, and over 60% of users answered that the program is easy to use. The comparison of the two SUS tests shows a 6-point increase for the improved interface over the initial interface. Accordingly, the proposed interface enhanced the system usability as well as the system interactivity in terms of user satisfaction.
Figure 13. Interface Improvement (Comparison of the SUS Test Results)
CONCLUSION AND FUTURE WORK
Much effort has been put into developing highly immersive interfaces to enhance the recognition rate of motion-controlled interactive programs, such as video games combined with various motion controllers. This research studied existing head-mounted display (HMD)-enabled interface programs and suggested an improved interface scheme for HMD-enabled virtual reality (VR) games. In parallel, a user-centric VR environment was built in which users'
motions such as turning the head and raising the hands are recognized as various gestures in combination with the well-known contact-type Hydra and non-contact-type Kinect systems. In particular, three Kinects were arranged to improve the motion recognition rate. The developed interface was integrated with two representative testbed games belonging to the action and sports genres. Afterwards, the effectiveness and usability of the interface were assessed through the SUS test. The test results show that the proposed interface enhanced the system usability as well as the system interactivity in terms of user satisfaction. While this research was being conducted, many console-oriented interfaces, including Kinect and various stick-style interfaces, were extended to become available on the PC. The Oculus Rift is a head-tracking-enabled headset, but due to its weight it needs a wired controller connected to the headset. Sony drew public attention with its new wearable HMZ-T3W HMD thanks to its light weight, but it does not include the head-tracking feature generally required for gaming. Although motion sensing interfaces with multiple cameras improve the recognition rate, there are still restrictions on capturing multiple users' motions simultaneously. As mentioned above, upcoming HMD interfaces are expected to be lightweight and wireless enough to feel comfortable and to maximize the degree of immersion in games.

REFERENCES
Brooks, Jr., F. P. (1999). "What's Real About Virtual Reality?", IEEE Computer Graphics and Applications, 19(6), 16-27, November 1999.
Sutherland, I. E. (1963). "Sketchpad: A Man-Machine Graphical Communication System", Proceedings of the Spring Joint Computer Conference.
Sutherland, I. E. (1968). "A head-mounted three dimensional display", Proceedings of the AFIPS Fall Joint Computer Conference, 33, 757-764.
Rolland, J. & Hua, H. (2005). "Head-Mounted Display Systems", Encyclopedia of Optical Engineering. DOI=10.1081/E-EOE-120009801.
Bolt, R. A. (1980). ""Put-That-There": Voice and Gesture at the Graphics Interface", Proceedings of the 7th Annual Conference on Computer Graphics and Interactive Techniques, 262-270.
10 Emerging Technologies 2011 (2011). Technology Review, MIT Technology Review.
Kinect, available from: http://en.wikipedia.org/wiki/Kinect
iPoint 3D, available from: http://www.fraunhofer.de/
iPoint 3D, available from: http://phys.org/news154285641.html
LeapMotion, available from: https://www.leapmotion.com/product
Brooke, J. (1996). "SUS – A quick and dirty usability scale", Usability Evaluation in Industry.
Bangor, A., Kortum, P. T. & Miller, J. T. (2008). "An Empirical Evaluation of the System Usability Scale", Int'l Journal of Human-Computer Interaction, 24(6), 574-594.
Oculus Rift, available from: http://www.oculusvr.com/rift/
Sony HMZ-T3W, available from: http://store.sony.com/wearable-hdtv-2d-3d-virtual-7.1-surround-sound-zid27-HMZT3W/cat-27-catid-3D-Personal-Viewer?_t=pfm%3Dcategory%26pfmvalue%3Dfaceted&SR=sony_search_seo&SQ=HMZ-T3