
A Natural and Immersive Virtual Interface for the Surgical Safety Checklist Training

Andrea Ferracani, Daniele Pezzatini, Alberto Del Bimbo
Università degli Studi di Firenze - MICC

Firenze, Italy
[name.surname]@unifi.it

ABSTRACT
Serious games have been widely exploited in medical training and rehabilitation. Although many medical simulators exist with the aim of training the personal skills of medical operators, only a few of them take cooperation between team members into account. Since the introduction of the Surgical Safety Checklist by the World Health Organization (WHO), which has to be carried out by surgical team members, several studies have proved that the adoption of this procedure can remarkably reduce the risk of surgical crises. In this paper we introduce a natural interface featuring an interactive virtual environment that aims to train medical professionals to follow the safety procedures proposed by the WHO, adopting a ‘serious game’ approach. The system presents a realistic and immersive 3D interface and allows multiple users to interact using vocal input and hand gestures. Natural interaction between users and the simulator is obtained by exploiting the Microsoft Kinect™ sensor. The game can be seen as a role-play game in which every trainee has to perform the correct steps of the checklist according to his/her professional role in the medical team.

Categories and Subject Descriptors
H.5.2 [Information Systems Applications]: Information Interfaces and Presentation—Miscellaneous; J.3 [Computer Applications]: Life and Medical Sciences—Health

General Terms
Gaming, edutainment, human factors, health

Keywords
Serious games, medical training, immersive environments, Surgical Safety Checklist

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].
SeriousGames’14, November 07 2014, Orlando, FL, USA
Copyright 2014 ACM 978-1-4503-3121-0/14/11. $15.00.
http://dx.doi.org/10.1145/2656719.2656725

1. INTRODUCTION
The concept of ‘serious games’ has been used since the 1960s to refer to solutions that adopt gaming for educational purposes rather than pure player entertainment [1]. Among the several fields in which serious games have been exploited, medicine is one of the most prolific [12], counting a large number of applications that feature Immersive Virtual Environments (IVEs). Improvements in medical training using gaming and IVEs have been achieved especially in the field of surgical education, where IVEs already play a significant role in training programmes [20, 7].

The Off-Pump Coronary Artery Bypass (OPCAB) game [8] and the Total Knee Arthroplasty game [9], for example, focus on the training of decision steps in a virtual operating room. Serious games featuring IVEs on other topics are Pulse! [6], for acute and critical care, CAVE™ triage training [3] and Burn Center™ for the treatment of burn injuries [17]. Though all these medical training simulators focus on individual skills, an important aspect of healthcare to be taken into account is that, for the most part, it is provided by teams. Common techniques of team training in hospitals include apprenticeship, role playing and rehearsal, and involve high costs due to the required personnel, simulated scenarios that often lack realism, and the large amount of time needed. Several serious games featuring IVEs for team training have also been developed in recent years. 3DiTeams [6], CliniSpace™ [18], HumanSim [22], Virtual ED [26] and Virtual ED II [14] are some examples of games for team training in acute and critical care whose main objective is to identify and reduce weaknesses in operational procedures. A virtual environment for training combat medics has been developed by Wiederhold and Wiederhold [25] to prevent post-traumatic stress disorder. Medical team training systems for emergency first response, distributed over the network, have been developed by Alverson et al. [2] and Kaufman et al. [16].

IVEs have proved to be an effective educational tool. Tasks can be repeated in a safe environment, as often as required. IVEs, in fact, allow learners to realistically experience a wide range of situations that would be impossible to replicate in the real world due to danger, complexity or impracticability. Though IVEs have traditionally been associated with high costs due to the necessary hardware, especially to provide real-time interactivity (multiple projectors, input devices, etc.), and have always presented a difficult setup, in recent years these issues have been partially solved by the availability of cheap and easily deployable devices such as Nintendo’s Wii or the Microsoft Kinect™.

Moreover, the 3D technologies used in IVEs offer several advantages in medical education and training. In this context it is essential to provide a realistic representation of the environment and several points of view, in order to create the correct mental model in learners and to reinforce the memorability of a specific procedure. On the basis of constructivist theory [11], one of the main features of educational IVEs is the possibility of providing highly interactive experiences capable of intellectually engaging trainees in carrying out the tasks and activities they are responsible for. The opportunity to navigate educational scenarios in first person gives learners a more direct awareness of the interaction context and of their responsibilities than sessions mediated through an unrelated element such as a graphical user interface or another symbolic representation. In this regard IVEs require less cognitive effort to elaborate the context, and stimulate imagination. This is even more true in scenarios where not only are skills learned in the environment where they will be applied, but tasks can also be carried out by the trainee acting in a natural way, without the mediation of any device (e.g. keyboard, mouse or other controllers). Constructivists also point out that learning is enhanced when gaming, group work and cooperation are provided [10]. Interacting with humans is more interesting and involving than interacting with a computer, and it implies personal responsibility: each learner is expected by the others to account for his/her actions and to contribute to the achievement of the team goal. The use of gaming techniques in education is called ‘edutainment’, and it is preferred by trainees to traditional methods because it increases students’ motivation and engagement in the learning context [15].

Nevertheless, despite the perceived advantages and the good reception of serious games featuring IVEs for training, the adoption of such systems in real medical facilities is still limited. This is due to the fact that open issues still exist in system usability, such as the ease of getting lost given the possibility of free movement, or the difficulty of providing a contextual and significant set of options in the environment without switching to traditional device-controlled 2D interfaces (panels visualizing multiple choices, or drop-down menus). In particular, a significant drawback of all the mentioned systems for team training is that, for the most part, trainees have to interact with the interface via computer monitors or head-mounted displays. This reduces the immersive effect and hinders the desired natural collaboration of participants in the simulation. In this regard we think that serious games could benefit greatly from adopting natural interaction techniques and exploiting new low-cost devices available on the market for navigating and controlling IVEs.

In this paper we introduce a serious game set in an immersive virtual environment, with the aim of training surgical professionals to efficiently carry out the Surgical Safety Checklist (SSC) introduced by the World Health Organization in 2009. The system features natural interaction techniques and is easily deployable, requiring only a standard PC, a projector and a Kinect™ sensor. The proposed system thus belongs to the class of vocational training systems, in which the need is to train an activity that is part of surgeons’ everyday work. The paper is organized as follows: Sect. 2 describes the SSC guidelines in detail and defines the proposed simulation scenario. In Sect. 3 the architecture and the main modules of the system are discussed, along with some implementation details. Finally, conclusions are drawn in Sect. 4.

2. SURGICAL CHECKLIST SIMULATION
Surgical care is a central part of healthcare throughout the world, counting an estimated 234 million operations performed every year [24]. Although surgical operations are essential to improve patients’ health conditions, they may lead to considerable risks and complications. In 2009, the World Health Organization (WHO) published a set of guidelines and best practices in order to reduce surgical complications and to enhance team-work and cooperation [13]. The WHO summarised many of these recommendations in the Surgical Safety Checklist, shown in Fig. 1. Starting from the proposed guidelines, many hospitals have implemented their own version of the SSC in order to better match the requirements of internal procedures.

The SSC identifies three main distinct phases, each corresponding to a specific period during the execution of an operation: “before the induction of anaesthesia”, “before the incision of the skin” and “before the patient leaves the operating room”. In each phase, the surgical team has to complete the listed tasks before it proceeds with the procedure. All the actions must be verified by a checklist coordinator who is in charge of guaranteeing the correctness of the procedure. The goal is to ensure patient safety by checking machinery state and patient conditions, verifying that all the staff members are identifiable and accountable, and avoiding errors in patient identity, site and type of procedure. In this way, risks endangering the well-being of surgical patients can be efficiently minimized.

Arriaga et al. [4] conducted several simulations of a surgical crisis scenario in order to assess the benefits obtained by the adoption of the SSC. Results have shown that the execution of the SSC improves the medical team’s performance and that failure to adhere to best practices during a surgical crisis is remarkably reduced.

Figure 1: The Surgical Safety Checklist proposed by the WHO in 2009.

The proposed system is a serious game featuring an IVE to train users in the accomplishment of the surgical checklist. It adopts natural interaction via gestures and voice, and de facto acts as the ‘checklist coordinator’ of the surgical team.

The simulation involves multiple trainees (up to three), each of them associated with his/her role, and guides them through a complete SSC in a surgical operation scenario.

The three actors (surgeon, anesthesiologist and nurse) expected to complete the checklist are automatically detected by the system when approaching the IVE, and can interact by gesturing and speaking as in real life. Interactions and choices are not mediated by haptic devices or other controllers, but mimic how the real procedure should be performed.

2.1 Simulation scenario
The virtual simulation allows the three health professionals who are going to execute a surgical operation to complete the SSC, each with respect to his/her professional role. The IVE was designed with the objective of helping trainees understand the correct procedures to be followed.

Professionals (i.e. trainees) stand in front of the simulation interface (see Fig. 2). The simulator associates users with a professional role on the basis of their position in the physical space: the user standing on the left is associated with the anesthesiologist, the one in the centre with the surgeon and the one on the right with the nurse.
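The left-to-right role assignment just described can be sketched as follows. This is a minimal illustrative Python sketch, not the actual implementation (the system itself is written in C# on the Kinect SDK); user IDs and coordinates are hypothetical.

```python
# Sketch of position-based role assignment: the leftmost tracked user
# becomes the anesthesiologist, the middle one the surgeon, the rightmost
# the nurse. Positions are (x, z) in metres, x growing to the right as
# seen from the sensor. All values here are illustrative.

ROLES = ["anesthesiologist", "surgeon", "nurse"]

def assign_roles(users):
    """Map up to three tracked users to roles by left-to-right order.

    `users` is a dict {user_id: (x, z)}.
    """
    ordered = sorted(users.items(), key=lambda item: item[1][0])  # sort by x
    return {user_id: role for (user_id, _), role in zip(ordered, ROLES)}

# Example: three users standing left, centre and right of the sensor.
positions = {7: (0.6, 2.1), 3: (-0.7, 2.0), 5: (0.0, 2.2)}
print(assign_roles(positions))  # {3: 'anesthesiologist', 5: 'surgeon', 7: 'nurse'}
```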

Figure 2: Avatar selection: the trainee can choose which role to enact by standing on the left, in the centre or on the right of the IVE.

Once every user is associated with a role, the IVE is shown and the simulation can start. The first environment represents the pre-operating room, where the “before the induction of anaesthesia” part of the SSC usually takes place, with the patient and the three professionals’ avatars (see Fig. 3). From this moment, a user can take control of the simulation by taking a step forward. When one of the users has control, the environment is shown from his/her first-person point of view (POV). The active user can then interact via voice or gesture in order to carry out the specific step of the SSC procedure.
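The step-forward takeover can be sketched as a simple policy on the users' depth values. This is an illustrative Python sketch under assumed names and a hypothetical threshold; the real system derives distances from the Kinect™ depth map.

```python
# Sketch of the active-user policy: the trainee closest to the sensor
# (smallest z) controls the interface, and control switches only when
# someone is clearly ahead of the current controller, to avoid flicker
# between users standing at similar depths. Threshold is an assumption.

STEP_THRESHOLD_M = 0.25  # how far a trainee must step ahead to take over

def update_active_user(depths, current_active):
    """`depths` maps user_id -> distance from the sensor on the z-axis (m)."""
    closest = min(depths, key=depths.get)
    if current_active is None:
        return closest
    # Switch only if the closest user is clearly ahead of the current one.
    if closest != current_active and \
            depths[current_active] - depths[closest] > STEP_THRESHOLD_M:
        return closest
    return current_active
```

A hysteresis threshold of this kind is a common way to stabilise "nearest user wins" policies against sensor noise.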

Interactions can be performed both by voice and by hand gestures. Voice-based interactions are used during the SSC when one of the professionals is expected to communicate with the patient or with another team member. For instance, one step of the SSC simulation requires the nurse to confirm the site of the operation to the other team members; let us imagine it to be the right arm. Accordingly, the user enacting the nurse should take a step forward and say something like “we are going to operate on the right arm”, or a similar sentence. As soon as the sentence is pronounced,

Figure 3: A 3D pre-operating room environment from a first-person POV, showing the patient and the other professionals’ avatars.

the system verifies that the site of the operation is correct, updates the SSC and gives feedback in order to continue with the simulation.

Hand gestures (e.g. hand pointing and push) are instead used for other types of interaction, such as touching the patient, checking the state of the medical equipment or activating virtual menus. In the “before the induction of anaesthesia” part of the procedure, the nurse has to indicate with his/her hand the part of the body to be operated on. The IVE displays the patient lying on the bed, viewed from above, in order to allow the trainee better precision of movement in the 3D space. Contextually, a hand pointer is shown, mapped to the real position of the nurse’s hand, which allows him/her to select the body part, as shown in Fig. 4.
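The hand-pointer selection can be sketched as a mapping from the tracked hand position to screen coordinates, followed by a hit test against the selectable body parts. This Python sketch is purely illustrative: coordinate conventions, hit-box values and body-part names are assumptions, not taken from the actual system.

```python
# Sketch of hand-pointer mapping and body-part hit testing.
# Assumes the tracked hand position is normalised to [-1, 1] on both axes.

def hand_to_pointer(hand_x, hand_y, screen_w=1920, screen_h=1080):
    """Map a normalised hand position to pixel coordinates."""
    px = int((hand_x + 1.0) / 2.0 * screen_w)
    py = int((1.0 - (hand_y + 1.0) / 2.0) * screen_h)  # screen y grows downward
    return px, py

# Axis-aligned hit boxes for selectable body parts: (x0, y0, x1, y1).
# Values are hypothetical, for a top-down view of the patient.
BODY_PARTS = {
    "right arm": (1200, 400, 1500, 550),
    "left arm": (420, 400, 720, 550),
    "abdomen": (800, 450, 1120, 700),
}

def pick_body_part(px, py):
    """Return the body part under the pointer, or None."""
    for part, (x0, y0, x1, y1) in BODY_PARTS.items():
        if x0 <= px <= x1 and y0 <= py <= y1:
            return part
    return None

# Example: a hand held up and to the right lands on the right arm.
print(pick_body_part(*hand_to_pointer(0.5, 0.0)))  # right arm
```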

Figure 4: The nurse indicates the patient’s body part to be operated on by pointing with his/her hand.

Furthermore, during the simulation, the active user can perform a swipe gesture with his/her hand to activate a virtual overlay containing all the information about the patient’s clinical history and the status of the SSC to be carried out. The overlay simulates the patient’s medical card, usually available in the operating room. When all the steps of the checklist are correctly performed, trainees receive feedback on the successful outcome of the simulation, and a summary is presented with the time spent completing the procedure and a record of all the errors committed by each user.
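The right-to-left swipe that opens the overlay can be sketched as a check on recent hand positions. This is an illustrative Python sketch; the window size and travel threshold are assumptions, not values from the actual system.

```python
# Illustrative swipe detector: buffer recent hand x-positions and fire when
# the hand travels far enough right-to-left within a short window.

from collections import deque

class SwipeDetector:
    def __init__(self, window=10, min_travel=0.4):
        self.xs = deque(maxlen=window)   # recent hand x-positions (metres)
        self.min_travel = min_travel     # required right-to-left displacement

    def update(self, hand_x):
        """Feed one tracked x-position; return True when a swipe completes."""
        self.xs.append(hand_x)
        if len(self.xs) == self.xs.maxlen and \
                self.xs[0] - self.xs[-1] >= self.min_travel:
            self.xs.clear()              # avoid re-triggering on the same motion
            return True
        return False
```

Feeding the detector a steadily decreasing x trajectory triggers it once, then it resets and waits for a fresh motion.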

The system exploits these data in order to provide the simulation with some gaming aspects: 1) the team members share the goal of correctly completing the checklist in the shortest time (the system provides a ranking table of the best team scores); 2) each trainee competes with the other team members, and his/her performance is measured by a scoring system which keeps track of all the individual errors and shows the results at the end of the simulation.
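One possible way to combine completion time and per-trainee errors into the scores described above is sketched below. The weighting scheme (a fixed time weight plus a fixed penalty per error) is entirely an assumption for illustration; the paper does not specify the actual formula.

```python
# Hypothetical scoring sketch: team score mixes elapsed time and error count,
# individual reports rank trainees by errors committed. Lower team score is
# better. Weights are illustrative assumptions, not the system's values.

def team_score(completion_time_s, errors_per_user,
               time_weight=1.0, error_penalty=30.0):
    """Elapsed time plus a fixed penalty per recorded error."""
    total_errors = sum(len(errs) for errs in errors_per_user.values())
    return completion_time_s * time_weight + total_errors * error_penalty

def individual_report(errors_per_user):
    """Per-trainee error counts, worst first."""
    return sorted(((role, len(errs)) for role, errs in errors_per_user.items()),
                  key=lambda t: -t[1])

errors = {"surgeon": [], "nurse": ["wrong site"], "anesthesiologist": []}
print(team_score(240.0, errors))        # 270.0
print(individual_report(errors)[0][0])  # nurse
```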

Although the SSC has been strictly defined and formalised by the WHO at a high level, the ‘content’ of the simulation sessions (i.e. the patient’s anamnesis and clinical card) is fully configurable in the system. This means that instructors can simulate, and trainees experience, all possible operations and risks, and therefore that specific scenarios can be created for teams of medical professionals in every field.
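Sect. 3.3 notes that the Scenario Configuration is expressed in JSON. A hypothetical fragment, purely to illustrate what such a configuration could look like, might be (all field names and values here are assumptions, not the actual schema):

```json
{
  "patient": {
    "id": "demo-patient-01",
    "anamnesis": "No known allergies; anticoagulant therapy suspended.",
    "clinicalCard": {
      "procedure": "osteosynthesis",
      "site": "right arm",
      "expectedBloodLoss": "low"
    }
  },
  "phases": [
    {
      "name": "before the induction of anaesthesia",
      "steps": [
        { "role": "nurse", "interaction": "voice",
          "acceptedKeywords": ["right arm"] },
        { "role": "nurse", "interaction": "gesture",
          "target": "right arm" }
      ]
    }
  ]
}
```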

3. THE SYSTEM
The proposed simulation system exploits a realistic 3D interface and natural interaction paradigms in order to train users to correctly execute the SSC. Users stand in front of a large screen or projection and can interact without using any wearable or handheld device; instead, they use their voice and body as the interface controller. The system is composed of two main modules:

• The Interaction Detection Module (IDM).

• The Immersive 3D Interface and the associated Game Status Controller (GSC).

Figure 5 shows how these modules are organized from a logical point of view in the simulator.

The remainder of this section describes the modules in detail. Finally, some technical implementation details are provided.

Figure 5: Logical system architecture: the IDM analyses motion and depth data coming from the Kinect™ sensor, detects actions and communicates with the GSC, which updates the IVE on the basis of the scenario’s configuration.

3.1 Interaction Detection
The interaction between users and the IVE is obtained by tracking movements and identifying users’ actions with the Microsoft Kinect™ sensor [27]. In particular, the interface can be controlled by trainees using their position in the physical space, hand gestures and voice (see Fig. 6). The three different types of interaction can be executed concurrently, or are turned on/off depending on the phase of the simulated procedure. For instance, if the simulator is waiting for a gesture input from the active user, the speech recognition module is temporarily switched off. The IDM is responsible for recognizing user actions and notifying them to the GSC module in order to proceed with the simulation.

Figure 6: The different types of interaction the system is able to detect.

In detail, the IDM is able to detect:

Active user. The Kinect™ sensor is able to identify up to six human figures standing in front of the camera, but it can only track two skeletons simultaneously. Since the system is designed for three users, a policy is needed to dynamically define which of them is controlling the simulation (i.e. the interface). From the depth map obtained by the sensor, the IDM detects which user is closest to the interface. When a trainee takes a step forward, resulting in a reduction of the distance on the z-axis, the module notifies a change of the active user. This detection is active during the whole simulation session.

Hand gestures. Once the active user is identified, skeleton tracking is exploited to detect his/her movements and, in particular, to track his/her hand position in space. The hand position is used to map a hand pointer in the IVE, used to interact with interface elements. The IDM tracks the active hand of the trainee and sends its spatial coordinates in the 3D space in order to update the interface. When the user needs to ‘activate’ some virtual element of the interface, he/she must perform a push gesture with the open hand, somewhat similar to a click in a mouse-based interface. Hand tracking and gesture updates can be switched on/off by the GSC, depending on the phase of the simulation. Furthermore, a swipe gesture is provided that allows the user to open a virtual 2D overlay on the interface containing the patient’s case history and clinical card. This gesture is performed by moving the right arm from right to left.

Speech inputs. The Kinect features an array of four separate microphones spread out linearly at the bottom of the sensor. By comparing each microphone’s response to the same audio signal, the microphone array can be exploited to determine the direction the signal is coming from, and therefore to pay more attention to the sound from that direction rather than from others. The module checks whether the angle from which the vocal input is detected corresponds to the direction of the active user, in order to ignore unexpected speech from other users. So, once the IDM has identified the active user, it tries to understand his/her specific audio signal. To achieve this, background noise removal algorithms are applied [23]. The Microsoft Speech SDK is used to verify the correctness of trainees’ answers when they interact with the system via voice. In particular, it is exploited to assess whether the user’s vocal input corresponds to a correct value for the current SSC step. The GSC dynamically loads the sets of keywords that are correct for the current step from the Scenario Configuration file. Based on these sets, the module checks the audio input and computes a confidence value for the ‘speech-to-text’ match. If the detected confidence is greater than a threshold value, the correct interaction is notified to the GSC and the simulation can continue to the next step.
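The keyword-plus-confidence check can be sketched as follows. Note this is a toy stand-in: the real system obtains a confidence score directly from the Microsoft Speech SDK recogniser, whereas here a string-similarity ratio plays that role purely for illustration, and the threshold value is an assumption.

```python
# Toy sketch of matching a recognised transcript against the set of
# accepted keywords for the current SSC step, with a confidence threshold.

from difflib import SequenceMatcher

CONFIDENCE_THRESHOLD = 0.7  # illustrative value

def match_keyword(transcript, accepted_keywords, threshold=CONFIDENCE_THRESHOLD):
    """Return the best-matching accepted keyword, or None below threshold."""
    t = transcript.lower()
    best, best_conf = None, 0.0
    for kw in accepted_keywords:
        k = kw.lower()
        # Exact containment counts as full confidence; otherwise fall back
        # to a fuzzy similarity ratio as a stand-in confidence score.
        conf = 1.0 if k in t else SequenceMatcher(None, t, k).ratio()
        if conf > best_conf:
            best, best_conf = kw, conf
    return best if best_conf >= threshold else None
```

For example, the nurse's sentence about the operation site matches the "right arm" keyword, while unrelated speech falls below the threshold and is ignored.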

3.2 Game Status Controller
The GSC is the module responsible for the logic behind the whole simulation system. On simulation start-up, the controller parses an external Scenario Configuration file containing all the phases and correct answers for the training session. On the basis of the configuration, it initializes the 3D interface, setting up the environment (e.g. the pre-operating room) and the characters. For each simulation phase and each SSC procedure step, it communicates to the IDM which type of interaction must be detected. In the case of vocal input, the controller also specifies the set of correct keywords to be recognised in the current phase of the simulation. The GSC receives information about interactions and uses it to verify whether the actions are correct or wrong according to the Scenario Configuration and the SSC. The controller then updates the interface in order to give positive/negative feedback to trainees and, consequently, to trigger animations and changes of 3D cameras/points of view. All the actions and errors are recorded and stored in order to provide feedback on the team’s and each trainee’s performance.
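The GSC behaviour just described amounts to a small state machine over the configured steps. The sketch below illustrates that structure in Python; the step/field names are assumptions modelled on the description above, not the actual C# implementation.

```python
# Minimal sketch of the Game Status Controller loop: walk the configured
# steps in order, expose which interaction type the IDM should detect,
# advance on correct actions and record errors.

class GameStatusController:
    def __init__(self, steps):
        self.steps = steps          # list of step dicts from the scenario config
        self.index = 0
        self.errors = []            # (role, step index) of each wrong action

    @property
    def finished(self):
        return self.index >= len(self.steps)

    @property
    def expected_interaction(self):
        """What the IDM should currently detect ('voice' or 'gesture')."""
        return None if self.finished else self.steps[self.index]["interaction"]

    def on_action(self, role, value):
        """Handle a detected action; advance on a correct one."""
        step = self.steps[self.index]
        if role == step["role"] and value in step["accepted"]:
            self.index += 1
            return True
        self.errors.append((role, self.index))
        return False

# Example run over a two-step hypothetical scenario.
steps = [
    {"role": "nurse", "interaction": "voice", "accepted": ["right arm"]},
    {"role": "surgeon", "interaction": "gesture", "accepted": ["confirm"]},
]
gsc = GameStatusController(steps)
gsc.on_action("nurse", "left arm")    # wrong: recorded as an error
gsc.on_action("nurse", "right arm")   # correct: advances to the gesture step
```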

3.3 Implementation details
The system is composed of two modules: the GSC/IVE and the IDM. The GSC module and the 3D immersive interface have been developed within the Unity3D¹ environment, using both the C# and JavaScript programming languages to add interactivity support. The controller is in charge of loading and parsing a Scenario Configuration file that contains the SSC procedure structure, the patient’s case history and the clinical card in JSON syntax. The IDM exploits the Microsoft Kinect™ SDK to detect users’ interactions. It is developed in C# and uses a network socket to communicate with the main program. Scenes and characters have been created with Autodesk Maya. The system can run on any standard Windows workstation.

¹ http://unity3d.com/

4. CONCLUSIONS
In this paper we have presented a work-in-progress immersive virtual environment, featuring gaming techniques, designed to train medical operators and professionals in the adoption and correct execution of the procedures suggested by the WHO in the Surgical Safety Checklist. The importance of adopting the SSC during surgical operations has been proved by several studies, showing improvements in medical teams’ performance and a reduction of surgical crises. The main objective of the proposed system is therefore the design and implementation of a natural and immersive interface, sufficiently realistic and easy to use, to be actually adopted in medical education courses. In the immersive interface there is no free navigation: the environment is controlled and exploited as a means for the trainee to play his/her role in the game and to follow a strictly defined procedure. Trainees are required to decide who is in charge of carrying out each of the activities of the SSC, and this improves awareness and the sense of responsibility.

According to its functionalities, the developed system can be described using the taxonomy proposed by Wattanasoontorn et al., as shown in Table 1. Wattanasoontorn et al. extended the classification system defined by Rego et al. [19], who identified a taxonomy of criteria for the classification of serious games for health (application area, interaction technology, game interface, number of players, game genre, adaptability, progress monitoring, performance feedback and game portability).

Table 1: System classification by functionality according to Wattanasoontorn’s taxonomy.

Functionality             Solution
----------------------    ---------------------------
Application area          Cognitive
Interaction Technology    Microsoft Kinect
Game Interface            3D
Number of Players         Multiplayer (up to 3 roles)
Game Genre                Role Play
Adaptability              No
Performance Feedback      Yes
Progress monitoring       No
Game portability          Yes
Game Engine               Unity 3D / Kinect SDK
Platform                  PC
Game Objective            Professionals training
Connectivity              No

During the entire design and development process of the serious game, medical professionals have been involved in order to reproduce scenarios and procedures that are highly compliant with real surgical operations. Future work will include several assessments of the system, exploiting both usability tests [5] and heuristic evaluations of the natural and immersive virtual interface [21].

5. REFERENCES
[1] C. C. Abt. Serious Games. Viking Press, New York, 1970.
[2] D. C. Alverson, S. S. Jr, T. P. Caudell, K. Summers, Panaiotis, A. Sherstyuk, D. Nickles, J. Holten, T. Goldsmith, S. Stevens, K. Kihmm, S. Mennin, S. Kalishman, J. Mines, L. Serna, S. Mitchell, M. Lindberg, J. Jacobs, C. Nakatsu, S. Lozanoff, D. S. Wax, L. Saland, J. Norenberg, G. Shuster, M. Keep, R. Baker, H. S. Buchanan, R. Stewart, M. Bowyer, A. Liu, G. Muniz, R. Coulter, C. Maris, and D. Wilks. Distributed immersive virtual reality simulation development for medical education. Journal of International Association of Medical Science Educators, 15(1), 2005.
[3] P. B. Andreatta, E. Maslowski, S. Petty, W. Shim, M. Marsh, T. Hall, S. Stern, and J. Frankel. Virtual reality triage training provides a viable solution for disaster-preparedness. Academic Emergency Medicine, 17(8):870–876, 2010.
[4] A. F. Arriaga, A. M. Bader, J. M. Wong, S. R. Lipsitz, W. R. Berry, J. E. Ziewacz, D. L. Hepner, D. J. Boorman, C. N. Pozner, D. S. Smink, et al. Simulation-based trial of surgical-crisis checklists. New England Journal of Medicine, 368(3):246–253, 2013.
[5] D. A. Bowman, J. L. Gabbard, and D. Hix. A survey of usability evaluation in virtual environments: classification and comparison of methods. Presence: Teleoperators and Virtual Environments, 11(4):404–424, 2002.
[6] M. W. Bowyer, K. A. Streete, G. M. Muniz, and A. V. Liu. Immersive virtual environments for medical training. In Seminars in Colon and Rectal Surgery, volume 19, pages 90–97. Elsevier, 2008.
[7] D. A. Cook, R. Hatala, R. Brydges, B. Zendejas, J. H. Szostek, A. T. Wang, P. J. Erwin, and S. J. Hamstra. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA, 306(9):978–988, 2011.
[8] B. Cowan, H. Sabri, B. Kapralos, F. Moussa, S. Cristancho, and A. Dubrowski. A serious game for off-pump coronary artery bypass surgery procedure training. In MMVR, pages 147–149, 2011.
[9] B. Cowan, H. Sabri, B. Kapralos, M. Porte, D. Backstein, S. Cristancho, and A. Dubrowski. A serious game for total knee arthroplasty procedure, education and training. Journal of CyberTherapy & Rehabilitation (JCR), 3(3), 2010.
[10] C. Dede. The evolution of constructivist learning environments: Immersion in distributed, virtual worlds. Educational Technology, 35(5):46–52, 1995.
[11] T. M. Duffy and D. H. Jonassen. Constructivism: New implications for instructional technology. Constructivism and the Technology of Instruction: A Conversation, pages 1–16, 1992.
[12] M. M. Hansen. Versatile, immersive, creative and dynamic virtual 3-D healthcare learning environments: a review of the literature. Journal of Medical Internet Research, 10(3), 2008.
[13] A. B. Haynes, T. G. Weiser, W. R. Berry, S. R. Lipsitz, A.-H. S. Breizat, E. P. Dellinger, T. Herbosa, S. Joseph, P. L. Kibatala, M. C. M. Lapitan, et al. A surgical safety checklist to reduce morbidity and mortality in a global population. New England Journal of Medicine, 360(5):491–499, 2009.
[14] W. L. Heinrichs, P. Youngblood, P. M. Harter, and P. Dev. Simulation for team training and assessment: case studies of online training with virtual worlds. World Journal of Surgery, 32(2):161–170, 2008.
[15] J. G. Hogle. Considering games as cognitive tools: In search of effective “edutainment”. 1996.
[16] M. Kaufman. Team training of medical first responders for CBRNE events using multiplayer game technology. In Proceedings of Medicine Meets Virtual Reality, 2006.
[17] S. N. Kurenov, W. W. Cance, B. Noel, and D. W. Mozingo. Game-based mass casualty burn training. Studies in Health Technology and Informatics, 142:142–144, 2008.

[18] D. Parvati, W. L. Heinrichs, and Y. Patricia.Clinispace: A multiperson 3d online immersivetraining environment accessible through a browser.Medicine Meets Virtual Reality 18: NextMed, 163:173,2011.

[19] P. Rego, P. M. Moreira, and L. P. Reis. Serious gamesfor rehabilitation: A survey and a classificationtowards a taxonomy. In Information Systems andTechnologies (CISTI), 2010 5th Iberian Conferenceon, pages 1–6. IEEE, 2010.

[20] H. W. Schreuder, G. Oei, M. Maas, J. C. Borleffs, andM. P. Schijven. Implementation of simulation insurgical practice: minimally invasive surgery has takenthe lead: the dutch experience. Medical teacher,33(2):105–115, 2011.

[21] A. Sutcliffe and B. Gault. Heuristic evaluation ofvirtual reality applications. Interacting withcomputers, 16(4):831–849, 2004.

[22] J. M. Taekman and K. Shelley. Virtual environmentsin healthcare: immersion, disruption, and flow.International anesthesiology clinics, 48(3):101–121,2010.

[23] J. Webb and J. Ashley. Beginning Kinect Programmingwith the Microsoft Kinect SDK. Apress, 2012.

[24] T. G. Weiser, S. E. Regenbogen, K. D. Thompson,A. B. Haynes, S. R. Lipsitz, W. R. Berry, and A. A.Gawande. An estimation of the global volume ofsurgery: a modelling strategy based on available data.The Lancet, 372(9633):139–144, 2008.

[25] B. K. Wiederhold and M. Wiederhold. Virtual realityfor posttraumatic stress disorder and stressinoculation training. Journal of Cybertherapy &Rehabilitation, 1(1):23–35, 2008.

[26] P. Youngblood, P. M. Harter, S. Srivastava,S. Moffett, W. L. Heinrichs, and P. Dev. Design,development, and evaluation of an online virtualemergency department for training trauma teams.Simulation in Healthcare, 3(3):146–153, 2008.

[27] Z. Zhang. Microsoft kinect sensor and its effect.MultiMedia, IEEE, 19(2):4–10, 2012.