
[POSTER] Holographic iRay: Exploring Augmentation for Medical Applications

Tian Xie1, Mohammad M. Islam1, Alan B. Lumsden2, Ioannis A. Kakadiaris1,*

1Computational Biomedicine Lab, Dept. of Computer Science, Univ. of Houston, 2Methodist Research Institute, Houston, USA

Figure 1: Overview of the Holographic iRay prototype: (a) selection view; (b) AR view, and (c) observation view.

ABSTRACT A Holographic iRay prototype focusing on medical augmented reality is presented. The prototype is built with the Microsoft HoloLens and the Unity engine, based on a previous iRay system developed for the iPad. A human subject is scanned using magnetic resonance imaging, and the torso surface is pre-operatively segmented for 3D registration. The registration is performed by the iterative closest point algorithm between the pre-operative torso surface and the active torso surface mesh provided by the HoloLens spatial mapping and the gaze interaction. A scanning box is visualized at the gaze point to help the user select the target area. The pre-operative torso surface and the sampled active vertices within the scanning box are additionally overlaid in the box as auxiliary guidance. Several simple interactions are designed to control the rendering of the inner organs after the registration. The experimental results demonstrate the potential of the Holographic iRay for medical applications.

Keywords: Medical AR, Markerless Registration, ICP, HMD, HoloLens, Spatial Mapping.

Index Terms: H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems – Artificial, augmented, and virtual realities; H.5.2 [Information Interfaces and Presentation]: User Interfaces – Interaction styles; I.3.6 [Computer Graphics]: Methodology and Techniques – Interaction techniques; J.3 [Life and Medical Sciences]

1 INTRODUCTION This work is a holographic exploration of the iRay system [1], moving from mobile devices to a head-mounted display (HMD), the Microsoft HoloLens [2]. The iRay system is a mobile Augmented Reality (AR) application implemented on the iPad [3] with a high-precision depth sensor (Structure Sensor [4]). It registers the pre-operative torso surface to the active torso surface scanned by the depth sensor and tracks the camera using a simultaneous localization and mapping (SLAM) method provided by the Structure SDK [4].

The previous iRay system offers a limited immersive experience due to its handheld 2D display and non-spatial touch-screen interactions. These limitations motivated us to explore the capability of iRay on an HMD. We chose the HoloLens to implement the Holographic iRay because of the integrated surface reconstruction provided by its spatial mapping [5], which allows the reconstructed mesh to be used with the same registration strategy.

The HoloLens has been available for only one year, but has already impressed the public with its inside-out tracking system and standalone computational capability. Several holographic applications [6-9] have been developed on the HoloLens for medicine. However, most are not properly registered to the human subject; instead, they subjectively place the holograms on the ground, on a table, or in the air for educational purposes.

One reason for this is that the HoloLens does not provide an integrated registration method. It automatically tracks head pose and position, but does not track individual objects by default. Thus, registration remains as challenging as in traditional AR.

Scopis recently launched a Holographic Navigation Platform [10] for surgical use. It uses an external 3D positional tracking system based on a stereoscopic infrared camera with markers.

In contrast, the Holographic iRay prototype presented in this paper requires no sensors beyond the HoloLens itself. Markerless registration is achieved with the help of spatial mapping, which reconstructs the surfaces of the environment. The experiments demonstrate this prototype to be effective and easy to use for the tasks explored.

2 IMPLEMENTATION The Holographic iRay follows the system structure of the previous iRay system [1]. The offline pre-operative phase consists of a medical scan (MRI/CT), segmentation of the torso surface and inner organs, and model creation. The online system scans the environment and the human body, obtains the 3D points located in the target area selected by the user's gaze, registers the pre-operative surface to this point cloud, renders the inner organs based on the registration result, and responds to the user's interactions.

* Corresponding Author: [email protected]

Figure 2: Examples of visibility control.

Figure 1 depicts the three major views of the Holographic iRay prototype. Figure 1(a) depicts the selection view before registration. In this view, the reconstructed surfaces generated by the HoloLens spatial mapping are rendered as a white wireframe. A red circle cursor is drawn at the 3D point where the gaze direction currently hits the reconstructed surfaces. Around the cursor, a transparent white scanning box is rendered to emphasize the region of interest (ROI) in 3D. The registration is applied to the vertices within the scanning box and the vertices sampled from the pre-operative torso mesh. Therefore, the size of the box is defined by the 3D extent of the pre-operative torso surface, and the box is kept parallel to the ground. The vertices located inside the scanning box are previewed as green spheres; their maximum number is set to 50 to guarantee real-time performance. To help the user select the ROI, the pre-operative torso surface is overlaid in the middle of the scanning box. This guidance strategy was demonstrated to be effective in a user study in our previous work [11].
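The in-box vertex selection described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' Unity implementation; the function and parameter names, and the uniform-stride capping of the preview points, are our assumptions.

```python
import numpy as np

def select_active_vertices(vertices, box_center, box_size, max_preview=50):
    """Return the spatial-mapping vertices inside an axis-aligned scanning box.

    vertices:   (N, 3) array of spatial-mapping mesh vertices (world space)
    box_center: (3,) gaze hit point at which the box is centered
    box_size:   (3,) box extents, taken from the pre-operative torso bounds
    """
    half = np.asarray(box_size, dtype=float) / 2.0
    lo = np.asarray(box_center, dtype=float) - half
    hi = np.asarray(box_center, dtype=float) + half
    inside = np.all((vertices >= lo) & (vertices <= hi), axis=1)
    selected = vertices[inside]
    # Cap the number of preview spheres to keep rendering real time
    # (the paper uses a maximum of 50).
    if len(selected) > max_preview:
        step = len(selected) // max_preview
        preview = selected[::step][:max_preview]
    else:
        preview = selected
    return selected, preview
```

Because the box is axis-aligned and kept parallel to the ground, containment reduces to a per-axis bound check; no oriented-box math is needed.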

With these settings, the selection of active vertices is controlled entirely by gaze interaction, which is provided by HoloLens and Unity [12] components. When the user is satisfied with the selection and ready to register the surface, the voice command "Match" launches the registration. The registration is performed in real time by the iterative closest point (ICP) algorithm [13], which is built as a dynamic-link library inside the application.
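For reference, a minimal point-to-point ICP can be sketched as below: each iteration finds nearest-neighbor correspondences and solves for the best rigid (6-DOF) transform via SVD (the Kabsch solution). This is a generic illustration of the algorithm, not the paper's DLL; brute-force matching is acceptable here because both clouds are subsampled to fewer than 100 points.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping paired src onto dst
    (Kabsch/SVD solution)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def icp(src, dst, iters=30):
    """Brute-force point-to-point ICP; O(N*M) matching per iteration."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # Nearest-neighbor correspondences.
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[np.argmin(d, axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
        # Compose with the accumulated transform.
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

As the paper notes, ICP of this kind needs a reasonable initialization; the scanning-box placement supplies exactly that coarse alignment.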

Figure 1(b) depicts the AR view after registration. The holograms of the torso surface and inner organs are all rendered based on the transformation (6 degrees of freedom) computed by the ICP. Rendering of the spatial-mapping meshes is stopped, and the meshes no longer interact with the 3D cursor.

Figure 1(c) depicts the observation view where the user can observe the details of the organs of interest as floating holograms beside the patient. The user can use air-tap gestures [14] or say “Detail” to activate this view.

Under the AR view and observation view, the user can also use the air-tap or voice commands such as “Hide” and “Show” to control the visibility of the selected organs. Figure 2 depicts two examples of the results of visibility control.

In the current prototype, the detailed holograms are rendered at double scale and are not movable. However, functions to manipulate the detailed holograms and to add further auxiliary information (e.g., text, images, or animations) can easily be added in the future.

3 RESULTS AND DISCUSSION Figure 3 depicts the augmentation results from various perspectives. The tracking by the HoloLens is robust to large slants and motion blur as long as the human subject keeps still, but slight trembling occurs when holograms are too close to the eyes. Additionally, the holograms exhibit a dynamic bias relative to the camera position, which may be caused by camera distortion or tracking limitations.

The Holographic iRay works in real time. The resolution of the HoloLens spatial mapping is set to 5,000 triangles per cubic meter, although the device may not achieve exactly this number. The time between mesh updates is set to 2.5 s to allow re-computation of deforming surfaces. This combination of parameters is suitable for the current prototype, but it has not been tested or verified to yield the best performance.

The ICP registration works in real time. Both the pre-operative and the active point clouds are subsampled to fewer than 100 points before registration. The design of the scanning box further improves the registration speed: during the selection step, the pre-operative torso surface is already placed very near the active surface, which can be considered a coarse match implicitly performed by the user.
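The subsampling step can be sketched as below. The paper does not specify its sampling scheme, so this uniform-stride variant is purely illustrative; it preserves the first and last points and returns at most the requested count.

```python
import numpy as np

def downsample(points, max_points=100):
    """Uniform-index subsampling of an (N, 3) point cloud so that ICP
    runs in real time. A simple stand-in for the paper's unspecified
    sampling method."""
    n = len(points)
    if n <= max_points:
        return points
    # Evenly spaced indices from the first to the last point.
    idx = np.linspace(0, n - 1, max_points).round().astype(int)
    return points[idx]
```

Keeping both clouds under 100 points bounds the cost of brute-force nearest-neighbor matching at roughly 10,000 distance evaluations per ICP iteration.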

The accuracy of the active augmentation cannot be physically measured because the ground truth of the camera pose is not currently available. Fiducial markers or other additional sensors would be necessary for a quantitative measurement. A 3 cm translation in the direction of gravity is currently added to coarsely compensate for the 3D reconstruction error of the HoloLens. A calibration is required in future work to estimate the exact displacement of the spatial mapping. Raw depth information would be preferable, should the HoloLens provide access to it in the future.

The vivid immersive experience and convenient interactions were validated by several different users, demonstrating the Holographic iRay to be effective and efficient for the tasks explored.

Future work will focus on improving accuracy and adding interactions. More experiments will be performed to compare different registration methods for the tasks explored. Tracking of a moving human body will be studied, as the subject currently must remain still, a limitation of the SLAM tracking. Furthermore, user studies will be performed to improve the immersive experience for specific scenarios.

4 CONCLUSION In this paper, we presented a Holographic iRay prototype implemented on the HoloLens. The prototype explores the capability of our previous iRay system on the latest high-end HMD for medical applications. Spatial-mapping surfaces are used to register the torso surface of a human subject. The good immersive experience and simple interactions were validated by several users. This work expands the usability of the HoloLens in the medical domain.

ACKNOWLEDGMENTS This research was funded in part by the Methodist Research Institute and the UH Hugh Roy and Lillie Cranz Cullen Endowment Fund. All statements of fact, opinion or conclusions contained herein are those of the authors and should not be construed as representing the official views or policies of the sponsors.

REFERENCES
[1] I. A. Kakadiaris, M. M. Islam, T. Xie, C. Nikou, and A. B. Lumsden, "iRay: Mobile AR using structure sensor," in Proc. 15th IEEE International Symposium on Mixed and Augmented Reality, Merida, Mexico, 2016, pp. 127-128.
[2] Microsoft Corp. HoloLens. Available: https://www.microsoft.com/en-us/hololens
[3] Apple Inc. iPad. Available: https://www.apple.com/ipad/
[4] Occipital Inc. Structure Sensor. Available: https://structure.io/
[5] Microsoft HoloLens. Spatial mapping. Available: https://developer.microsoft.com/en-us/windows/mixed-reality/spatial_mapping
[6] Microsoft HoloLens. Partner Spotlight with Case Western Reserve University. Available: https://www.youtube.com/watch?v=SKpKlh1-en0
[7] MediSIM. Medical Simulated Interactive Manikin. Available: http://www.etc.cmu.edu/projects/medisim/
[8] HoloEyes Inc. HoloLens Mixed Reality Surgery: Holographic augmented mixed reality navigation. Available: https://www.youtube.com/watch?v=qLGD570I1OE
[9] ANIMA RES GmbH. INSIGHT HEART on HoloLens. Available: https://www.youtube.com/watch?v=c9O7L1Wtqqc&list=PLKFqg-NQRGaIR_mcetDGh7qZn2k4XklCB
[10] Scopis GmbH. Holographic Navigation Platform. Available: http://holo.scopis.com/
[11] T. Xie, M. M. Islam, A. B. Lumsden, and I. A. Kakadiaris, "Semi-automatic initial registration for the iRay system: A user study," in Proc. 4th International Conference on Augmented Reality, Virtual Reality and Computer Graphics, Ugento, Lecce, Italy, 2017, pp. 33-42.
[12] Unity. Available: https://unity3d.com/
[13] A. Geiger, P. Lenz, and R. Urtasun, "Are we ready for autonomous driving? The KITTI vision benchmark suite," in Proc. 25th IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 2012, pp. 3354-3361.
[14] Microsoft HoloLens. Gestures. Available: https://developer.microsoft.com/en-us/windows/mixed-reality/gestures

Figure 3: Augmentation results from various perspectives with motion blur. The first row depicts the augmentation results under motion blur when the camera is close to the subject. The second row depicts the results with large slants when the camera moves farther away or is not even looking at the registered torso. The third row depicts the results under large motion blur.