
COLLABORATIVE IMMERSIVE ANALYTICS

Mark Billinghurst
mark.billinghurst@unisa.edu.au

November 7th 2017
Workshop on Immersive Analytics

International Symposium on Big Data Analytics
Adelaide, Australia

Visualization Today

Collaborative Visualization

"The shared use of computer-supported, (interactive,) visual representations of data by more than one person with the common goal of contribution to joint information processing activities."

Isenberg, P., Elmqvist, N., Scholtz, J., Cernea, D., Ma, K. L., & Hagen, H. (2011). Collaborative visualization: definition, challenges, and research agenda. Information Visualization, 10(4), 310-326.

Petra Isenberg, 2011

Example of Collaborative Visualization

Multi-disciplinary Research

• Collaborative Visualisation related to:
  • Scientific Visualisation
  • Information Visualisation
  • Visual Analytics

• All well-established fields over the last 20 years
• But little existing research in Collaborative Visualisation

• E.g. from 1990 – 2010, over 1600 papers in the main visualisation conferences
  • Only 34 papers published on Collaborative Visualisation, ~2%

Collaborative Immersive Analytics (CIA)

The shared use of new immersive interaction and display technologies by more than one person for supporting collaborative analytical reasoning and decision making.

• Key properties
  • Use of immersive technologies
  • Computer-supported collaboration
  • Analytical reasoning and decision making

Immersive Technologies

Milgram’s Reality-Virtuality continuum

Reality – Virtuality (RV) Continuum:

Real Environment → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtual Environment

Mixed Reality: "...anywhere between the extrema of the virtuality continuum."

P. Milgram and F. Kishino, A Taxonomy of Mixed Reality Visual Displays. IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.

Collaborative Immersive Analytics

• Relationship to Mixed Reality, Visual Analytics, CSCW

CIA Example: EVL CAVE2

Marai, G. E., Forbes, A. G., & Johnson, A. (2016, March). Interdisciplinary immersive analytics at the electronic visualization laboratory: Lessons learned and upcoming challenges. In Immersive Analytics (IA), 2016 Workshop on (pp. 54-59). IEEE.

CAVE2 Hardware and Software

• Hardware
  • 36 computers drive 72 LCD panels – 320-degree display
  • 74-megapixel 2D / 37-megapixel passive-3D hybrid reality environment
  • 14 Vicon tracking cameras – tracking up to 6 people/objects; 20.2 audio

• Software
  • OmegaLib – open source software driving CAVE2 (rendering, input, etc.)
  • SAGE2 – browser-based collaboration/interaction platform
  • Support for hybrid devices + multi-user input

SAGE 2 Software Platform

• https://www.youtube.com/watch?v=V9zGmQpaRUU

ENDURANCE Case Study (2 Days)

• Working with a NASA team
  • Exploring Lake Bonney in the McMurdo Dry Valleys of Antarctica
  • Ice-covered lake; an Autonomous Underwater Vehicle collected sonar data

• CAVE2 used as a hybrid CIA system
  • CAVE walls – shared data representation
  • Laptops on tables – private workspaces, individual data analysis
  • Shared VR data exploration – virtually swimming through the data
  • Subgroups form at the large screen to analyze data

“.. the team got more done in 2 days than in 6 months of email, Skype, and Google Hangout.”

Types of CIA systems

• Classify system using CSCW Space-Time Taxonomy
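
The space/time taxonomy can be encoded directly as a small data structure. Below is a minimal Python sketch (illustrative only, not part of the talk) that tags a CIA system with its space and time dimensions and reports which of the four quadrants, discussed on the following slides, it falls into; the example system names are taken from those slides.

```python
from enum import Enum
from dataclasses import dataclass

class Space(Enum):
    CO_LOCATED = "same place"
    DISTRIBUTED = "different place"

class Time(Enum):
    SYNCHRONOUS = "same time"
    ASYNCHRONOUS = "different time"

@dataclass
class CIASystem:
    name: str
    space: Space
    time: Time

    def quadrant(self) -> str:
        # One of the four cells of the CSCW space-time matrix
        return f"{self.space.value} / {self.time.value}"

# Examples drawn from the four categories in the talk
systems = [
    CIASystem("CAVE2 / shared wall display", Space.CO_LOCATED, Time.SYNCHRONOUS),
    CIASystem("Social VR (Facebook Spaces)", Space.DISTRIBUTED, Time.SYNCHRONOUS),
    CIASystem("VR messages / web annotation", Space.DISTRIBUTED, Time.ASYNCHRONOUS),
    CIASystem("Public display + AR annotations", Space.CO_LOCATED, Time.ASYNCHRONOUS),
]

for s in systems:
    print(f"{s.name}: {s.quadrant()}")
```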

1. Co-Located Synchronous Collaboration

• Same time / same place collaboration
  • E.g. CAVE2, shared tables, interactive walls

• Advantages
  • Shared awareness, use of external tools (laptops, notes)
  • Easy movement between individual and group work


AR/VR Example: The MagicBook

The MagicBook

• Using AR to transition along Milgram’s continuum
  • Moving seamlessly from Reality to AR to VR (see the sketch below)

• Support for Collaboration
  • Face-to-face, shared AR/VR, multi-scale

• Natural interaction
  • Handheld AR and VR viewer

Billinghurst, M., Kato, H., & Poupyrev, I. (2001). The MagicBook: a transitional AR interface. Computers & Graphics, 25(5), 745-753.
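
One simple way to think about a transitional interface is as a single continuum parameter that controls how much of the view is real versus virtual. The toy sketch below is an assumption made for illustration, not how the MagicBook itself is implemented: it blends a real camera frame with a rendered virtual frame as that parameter moves from 0 (Reality) through AR to 1 (fully virtual).

```python
# rv: 0.0 = purely real view, 1.0 = fully virtual environment (hypothetical parameter)
def blend_views(real_frame, virtual_frame, rv: float):
    """Weighted mix of a real camera image and a rendered virtual scene."""
    rv = max(0.0, min(1.0, rv))
    return [(1.0 - rv) * r + rv * v for r, v in zip(real_frame, virtual_frame)]

# Moving from Reality (rv=0) through AR toward a fully virtual view (rv=1)
for rv in (0.0, 0.25, 0.75, 1.0):
    print(rv, blend_views([1.0, 1.0, 1.0], [0.0, 0.5, 1.0], rv))
```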

Demo: MagicBook

• https://www.youtube.com/watch?v=tNMljw0F-aw

2. Distributed Synchronous Collaboration

• Remote people working at the same time
  • E.g. collaborative VR, remote tabletops, shared browsers

• Advantages
  • Remote users in the same collaborative space, spatial cues

• But: can’t convey the same face-to-face cues

Example: Social VR

• Facebook Spaces, AltspaceVR
  • Bringing avatars into VR space
  • Natural social interaction

Demo: Facebook Spaces (2016)

https://www.youtube.com/watch?v=PVf3m7e7OKU

3. Distributed Asynchronous Collaboration

• Collaboration at a different time and different place
  • E.g. messages in VR, web annotation tools, document markup

• Advantages
  • Time for a more considered response, work whenever convenient
  • Combine information from many sources, better discussions

4. Co-Located Asynchronous Collaboration

• Collaborating at the same location but different times
  • E.g. public displays, shared physical message walls, AR annotations
  • Not well studied for information visualisation

• Advantages
  • Collaborators view the same physical space
  • Can use external objects to support collaboration (pens, notes)

Example: Hydrological Data Visualization

Hydrosys uses AR to display the locations of stations in a global sensor network, as well as interpolated temperature plotted as geodesic contours, with support for asynchronous annotation.

Image: Eduardo Veas and Ernst Kruijff

Veas, E., Kruijff, E., & Mendez, E. (2009). HYDROSYS – first approaches towards on-site monitoring and management with handhelds. In J. Hřebíček et al. (Eds.), Towards eEnvironment (EENVI 2009), Prague, Czech Republic.

Mixed Presence Collaboration

• Combining collaborative spaces
  • Connecting both co-located and distributed collaborators

Examples

• Many examples
  • Mixed presence tabletop with multiple people at each end
  • CAVE VR connecting multiple people at each site

• Advantages
  • Support for distributed collaboration plus the benefits of face-to-face groups

• Challenges
  • Supporting mutual awareness, representation of remote users

Lessons Learned

• In co-located systems the following are important:
  • supporting different independent viewpoints
  • enabling the use of different tools for different data
  • supporting face-to-face group work
  • supporting different data representations

Marai, G. E., Forbes, A. G., & Johnson, A. (2016, March). Interdisciplinary immersive analytics at the electronic visualization laboratory: Lessons learned and upcoming challenges. In Immersive Analytics (IA), 2016 Workshop on (pp. 54-59). IEEE.

General Guidelines

• In general, collaborative systems should support:
  • Shared context – knowledge/context around the data
  • Awareness of others – aware of others’ actions (sketched below)
  • Negotiation and communication – easy conversation
  • Flexible and multiple viewpoints – depending on roles

Churchill, E. F., Snowdon, D. N., & Munro, A. J. (Eds.). (2012). Collaborative virtual environments: digital places and spaces for interaction. Springer Science & Business Media.
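
The "awareness of others" guideline is commonly realised with a publish/subscribe pattern: each analyst's action on the shared data is broadcast so the other collaborators can see what is being done. The sketch below is a minimal, hypothetical illustration of that idea, not an API from any of the systems discussed here.

```python
from typing import Callable

class AwarenessHub:
    """Broadcasts every collaborator's action to all subscribed clients."""

    def __init__(self) -> None:
        self._subscribers: list[Callable[[str, str, dict], None]] = []

    def subscribe(self, callback: Callable[[str, str, dict], None]) -> None:
        self._subscribers.append(callback)

    def publish(self, user: str, action: str, detail: dict) -> None:
        for cb in self._subscribers:
            cb(user, action, detail)

hub = AwarenessHub()
hub.subscribe(lambda user, action, detail: print(f"[awareness] {user}: {action} {detail}"))
hub.publish("analyst_1", "filter", {"field": "temperature", "range": (0, 4)})
```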

Importance of Roles

• Asymmetric/symmetric problem solving
  • Teacher/student vs. equal collaborators

• Three different levels of engagement [Isenberg 2011]:
  • Viewing: people consume a data presentation without interacting with the data, such as in a lecture.
  • Interacting/exploring: people have the means to choose alternate views or explore the data.
  • Sharing/creating: people are able to create and distribute new datasets and visualizations to be explored.

• Need to design the interface differently for each role
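
One way to make the role distinction concrete is to map each engagement level to the actions it permits, so the interface can adapt per role. The Python sketch below is an illustrative assumption, not something from Isenberg et al.; the action names are hypothetical.

```python
from enum import Enum, auto

class EngagementLevel(Enum):
    VIEWING = auto()      # consuming a presentation, e.g. in a lecture
    INTERACTING = auto()  # choosing alternate views, exploring the data
    SHARING = auto()      # creating and distributing new datasets/views

# Illustrative capability map: higher levels include everything below them.
CAPABILITIES = {
    EngagementLevel.VIEWING:     {"view"},
    EngagementLevel.INTERACTING: {"view", "filter", "change_view"},
    EngagementLevel.SHARING:     {"view", "filter", "change_view", "create", "distribute"},
}

def can(level: EngagementLevel, action: str) -> bool:
    """Check whether a collaborator at this engagement level may perform an action."""
    return action in CAPABILITIES[level]

assert can(EngagementLevel.INTERACTING, "filter")
assert not can(EngagementLevel.VIEWING, "create")
```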

Methods for Interacting in CIAs

• Goal: natural interaction that supports collaboration

• Techniques used
  • Pointing and gestures – hand or full body
  • Dedicated devices – e.g. handheld tablet
  • Multimodal – touch + speech
  • Tangible interfaces – physical objects
  • Collaborative actions – working together
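
As a concrete illustration of the multimodal (touch + speech) technique, the toy sketch below pairs a pointing event with a spoken command when they arrive within a short time window, in the classic "put-that-there" style. The window length and event structures are assumptions for illustration only.

```python
from dataclasses import dataclass

FUSION_WINDOW_S = 1.5  # assumed window for pairing a touch with a spoken command

@dataclass
class TouchEvent:
    t: float   # timestamp in seconds
    x: float
    y: float

@dataclass
class SpeechEvent:
    t: float
    command: str  # e.g. "show details", "filter by year"

def fuse(touch: TouchEvent, speech: SpeechEvent):
    """Pair a pointing gesture with a spoken command if they are close in time."""
    if abs(touch.t - speech.t) <= FUSION_WINDOW_S:
        return {"command": speech.command, "target": (touch.x, touch.y)}
    return None

print(fuse(TouchEvent(10.2, 0.4, 0.7), SpeechEvent(10.9, "show details")))
```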

Opportunities for Research

• Many opportunities for research

• Using VR for CIA
  • HMD vs. CAVE performance

• Next-generation collaboration
  • Using AR/VR for face-to-face/remote collaboration

• Evaluation of CIA systems
  • Subjective/objective measures, cognitive evaluation

• Methods for asynchronous collaboration
  • Especially remote asynchronous systems

• Novel interaction methods
  • Multimodal input, gaze-based systems, etc.

• Exploring the CIA design space
  • Interaction metaphors, design patterns

Example: Holoportation (2016)

• Augmented Reality + 3D capture + high bandwidth
• http://research.microsoft.com/en-us/projects/holoportation/

Holoportation Video

https://www.youtube.com/watch?v=7d59O6cfaM0

Example: Empathy Glasses (CHI 2016)

• Combines eye tracking, a display, and facial expression sensing
• Implicit cues – eye gaze, facial expression

Pupil Labs (eye tracking) + Epson BT-200 (display) + AffectiveWear (expression sensing)

Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.

AffectiveWear – Emotion Glasses

• Photo sensors to recognize expression
• User calibration
• Machine learning
• Recognizing 8 facial expressions
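
A minimal way to picture the calibration-plus-recognition pipeline is a nearest-centroid classifier over the photo-sensor readings: calibration averages a few labelled readings per expression for the current user, and recognition picks the closest centroid. This is a toy sketch under that assumption, not the actual AffectiveWear classifier.

```python
import math

def calibrate(samples: dict[str, list[list[float]]]) -> dict[str, list[float]]:
    """Average the calibration readings for each expression into one centroid per label."""
    centroids = {}
    for label, readings in samples.items():
        n = len(readings)
        centroids[label] = [sum(col) / n for col in zip(*readings)]
    return centroids

def recognize(reading: list[float], centroids: dict[str, list[float]]) -> str:
    """Return the expression whose centroid is closest to the current sensor reading."""
    return min(centroids, key=lambda label: math.dist(reading, centroids[label]))

centroids = calibrate({
    "neutral": [[0.1, 0.1, 0.1], [0.2, 0.1, 0.1]],
    "smile":   [[0.8, 0.7, 0.2], [0.9, 0.6, 0.3]],
})
print(recognize([0.85, 0.65, 0.25], centroids))  # -> "smile"
```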

Empathy Glasses in Use

• Eye-gaze pointer and remote pointing
• Facial expression display
• In future: integrated eye tracking/display

Empathy Glasses Demo

https://www.youtube.com/watch?v=CdgWVDbMwp4

CONCLUSION

Conclusion

• Need for research on Collaborative Visualisation
  • Only ~2% of visualisation papers (1990 – 2010)

• New area: Collaborative Immersive Analytics
  • Visual Analytics + Mixed Reality + CSCW
  • Using immersive technologies
  • Early promising results – e.g. CAVE2 case studies

• Different classes of CIA systems
  • Classified according to the space/time taxonomy

• Many directions for future research
  • Interaction, evaluation, asynchronous collaboration, etc.

www.empathiccomputing.org

@marknb00

mark.billinghurst@unisa.edu.au
