Allosphere@CNSI: Towards a fully Immersive and Interactive Scientific Experience

Upload: xavier-amatriain

Post on 26-Jan-2015


DESCRIPTION

Presentation of the Allosphere project at UCSB. Imagine a three-story-high sphere suspended in a cube, where 3D video and audio are used for scientific discovery and exploration.

TRANSCRIPT

Page 1: The Allosphere

Allosphere@CNSI: Towards a fully Immersive and Interactive Scientific Experience

Page 2: The Allosphere

In partnership with the California NanoSystems Institute

MAT/CNSI

allosphere@CNSI

What is a Digital Media Center doing in a Nanosystems institute?

Page 3: The Allosphere

• A team of digital media researchers at UCSB has been fostering a cross-disciplinary field that unites science and engineering through the use of new media

• Allosphere = Integration + availability to a larger community

Description and Goals

Page 4: The Allosphere

Allosphere Steering Committee

• JoAnn Kuchera-Morin (Media Arts and Technology Initiatives)

• Xavier Amatriain (Media Arts and Technology Initiatives)

• Jim Blascovich (Psychology)

• Forrest Brewer (Electrical and Computer Engineering)

• Keith Clarke (Geography)

• Steve Fisher (Life Sciences)

• B.S. Manjunath (Electrical and Computer Engineering)

• Marcos Novak (Media Arts and Technology/Arts)

• Matthew Turk (Media Arts and Technology/Computer Science)

• T.B.D. (California Nanosystems Institute)

Page 5: The Allosphere

• The Allosphere – synthesis, manipulation, exploration and analysis of large-scale data sets...

– an environment that can simulate real sensory perception, providing multi-user immersive, interactive interfaces

• research into scientific visualization, numerical simulations, data mining, visual/aural abstract data representations, knowledge discovery, systems integration and human perception

Description and Goals

Page 6: The Allosphere

• Allosphere and other labs hosted in UCSB’s California Nanosystems Institute (CNSI)

The Building

Page 7: The Allosphere

• The space itself is already a part of the final instrument:

– a three-story anechoic sphere, ten meters in diameter, containing a built-in spherical screen.

The Space

Page 8: The Allosphere

• Once equipped, the CNSI Allosphere will be one of the largest immersive instruments in the world.

– unique features: true 3D spherical projection of visual and aural data, and sensing and camera tracking for interactivity.

The Features

Page 9: The Allosphere

• The AlloSphere is situated at one corner of the CNSI building, surrounded by different media labs.

– Visual Computing

– Interactive Installation

– Immersion/Eversion

– Robotics

– Plurilabs

Other MAT Labs at CNSI

Page 10: The Allosphere

Research in the Allosphere

Page 11: The Allosphere

• Inherent research comprises all of the activities that use the instrument as a research framework for immersive, multimodal environments:

Inherent Research

Page 12: The Allosphere

• Sensor and Camera Tracking Systems – research related to computer vision, as well as innovative interfaces and sensor networks that might be used to capture user interaction

Inherent Research. Interactivity

Page 13: The Allosphere

• System Design and Integrated Software/Hardware Research – integration of the different hardware and software components at play

Inherent Research. Systems

Page 14: The Allosphere

• Immersive Visual Systems Research – re-creation of an immersive visual space in a spherical environment

Inherent Research. Visual

Page 15: The Allosphere

• Immersive Audio Systems Research – re-creation of a virtual 3D sound environment in which sources can be placed at arbitrary points in space with convincing synthesis, and which makes it possible to simulate the acoustics of real spaces

Inherent Research. Audio

Page 16: The Allosphere

• Functional research includes those activities that will use the Allosphere as a tool for scientific exploration:

Functional Research

Page 17: The Allosphere

• Multidimensional knowledge discovery – deals with issues such as high-dimensional feature descriptors, similarity metrics, and indexing

– Machine learning, image data mining and understanding...

Functional Research. Knowledge

Page 18: The Allosphere

• Analysis of complex structures and systems – constructing the next generation of engineering paradigms requires a mechanism for rapid simulation, visualization and exploration supporting phenomena at multiple physical and temporal scales

Functional Research. Complex Systems

Page 19: The Allosphere

• Human perception, behavior and cognition – a valuable instrument for behavioral scientists interested in the impact of virtual environments, large-scale visualization, or spatial hearing.

Functional Research. Psychology

Page 20: The Allosphere

• Cartographic display and Information Visualization – offers remote sensing and geographic information science the opportunity to explore the potential of “inside-out” global data displays as tools for collective decision-making

Functional Research. Cartography

Page 21: The Allosphere

• Artistic scientific visualization/auralization – artistic principles are driving research into real-time interactivity and human manipulation of complex scientific data structures

Functional Research. Artistic Visualization

Page 22: The Allosphere

• Most of the research in the Allosphere (Functional and Inherent) has a direct mapping into future forms of Entertainment and Edutainment.

– We envision collaboration from the Entertainment Industry

The Future of Entertainment

Page 23: The Allosphere

Prototype Projects in the Allosphere

Page 24: The Allosphere

Prototype-driven System

• State-of-the-art system: many open research questions still need to be addressed.

• We want content to drive the system design.

• For that reason we are prototyping the instrument with different projects/requirements.

Page 25: The Allosphere

The Allobrain

• In collaboration with UCLA Brain Imaging Institute, Marcos Novak and many MAT/CREATE students (view video)

Page 26: The Allosphere

Quantum Spin Precession

• In collaboration with Prof. David Awschalom and Spintronics lab. Audiovisual model for coherent electron spin precession in a quantum dot

Page 27: The Allosphere

Multicenter Hydrogen Bond

• With Anderson Genotti – Materials researcher and discoverer of the multicenter hydrogen bond – and Prof. Van de Walle. Visualization and multi-modal representation of unique atomic bonds for alternative fuel sources (view video).

Page 28: The Allosphere

NanoCAD in the Allosphere

• In collaboration with BinanGroup's NanoCAD

Page 29: The Allosphere

Alloproteins

• In collaboration with the Chemistry/CS departments, using Chromium and VMD

Page 30: The Allosphere

An Engineering Challenge

Page 31: The Allosphere

Innovation

• The Allosphere presents innovative aspects with respect to existing environments such as the CAVE

– Spherical environment with 360 degrees of visual stereophonic information: spherical immersive systems enhance subjective feelings of immersion, naturalness, depth and “reality”.

– It is fully multimedia, as it combines the latest techniques in both virtual audio and visual data spatialization. Combined audio-visual information can aid understanding, but most existing immersive environments focus on visual data.

Page 32: The Allosphere

Innovation

– Completely interactive and multimodal environment, including camera tracking systems, audio recognition and sensor networks.

– Pristine scientific instrument – e.g. the containing cube is a fully anechoic chamber, and details such as room modes and screen reflectivity have been studied.

– Multiuser: its size allows up to 15 people to interact and collaborate on a common research task.

Page 33: The Allosphere

An Engineering Challenge

Page 34: The Allosphere

An Engineering Challenge

The Visual subsystem

Page 35: The Allosphere

10/04/23 AlloSphere Video Design - Alex Kouznetsov UCSB Proprietary and Confidential

• The Allosphere display can only be compared to high-end, state-of-the-art planetariums (the Gates Planetarium at the Denver Museum of Nature & Science, or the Griffith Observatory in LA)

• Some AlloSphere requirements are considerably more demanding

– Variety of types of graphics including smaller size text

– Bright backgrounds and accurate color

– Stereo projection

– Excellent system flexibility and expandability

Overview

Page 36: The Allosphere


Overview

• Key Design Parameters

– Display quality/performance

– Mechanical/facilities constraints

– Overall system architecture, configuration management, automation, calibration

– Cost

• Secondary Concerns

– Aging

– Maintenance

– Upgrades

– Acoustic performance (of video equipment)

The Visual subsystem

Page 37: The Allosphere


Display Brightness

• What is required?

– Eyestrain-free operation over a decent range of color values

– Brightness levels at or above the photopic threshold for good contrast and color acuity

– High resolution

– Stereo/mono operation

• Given:

– Screen area: ~320 m²

– Projector overlap factor: 1.7

– Screen gain, direction averaged: 0.12

– 14 projectors with a max. 3K lumens/projector

Page 38: The Allosphere


• Simulation results

– ~10 cd/m² screen luminance per 42,000 lumens of total light input

• Recommendations

– 0.7–5 cd/m² recommended for multimedia domes

– 50 cd/m² for cinema projection (SMPTE)

• Conclusion

– 42K lumens is good enough for most applications

Display Brightness
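The slide's luminance figure follows from the parameters given earlier (screen area, overlap factor, gain, projector count). A minimal sketch of that estimate, assuming the usual diffuse-screen relation luminance = illuminance × gain / π; the design team's actual simulation surely models more detail, so treat this as a back-of-the-envelope check only:

```python
import math

def screen_luminance(total_lumens, overlap, gain, area_m2):
    """Rough screen luminance (cd/m^2) from total projector light output.

    Assumes a simple diffuse-screen model: illuminance (lux) on the
    screen, scaled by screen gain and divided by pi.
    """
    illuminance = total_lumens * overlap / area_m2  # lux on the screen
    return illuminance * gain / math.pi

# Slide parameters: 14 projectors x 3K lumens, overlap 1.7, gain 0.12, ~320 m^2
luminance = screen_luminance(14 * 3000, 1.7, 0.12, 320)  # roughly 8.5 cd/m^2
```

This first-order formula lands in the same ballpark as the ~10 cd/m² the simulation reports; the gap is plausibly the detail a one-line model ignores.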

Page 39: The Allosphere


• Active stereo introduces more than 50% loss in brightness, but...

– 5 cd/m² is still at the high end of recommendations for domes

– Active stereo introduces a dramatic gain in subjective quality perception.

• On the other hand, we cannot project much more than that because of:

– Back reflections

– Cross-reflections

Display Brightness

Page 40: The Allosphere


• “Eye-limiting resolution” is not feasible (right now)

– Approx. 150M pixels required to achieve 30 lp/deg (1 arc minute) in all directions

• 11 lp/deg (3 arc minutes) is the recommended value for domes

– 20M pixels, 14 projectors

[Figure: Allosphere line pairs per degree as a function of the total number of active pixels on all projectors combined]

Resolution
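Both pixel counts can be reproduced with solid-angle arithmetic: a full sphere subtends about 41,253 square degrees, and resolving one line pair needs two pixels. A sketch of that calculation (an idealized uniform-coverage estimate, ignoring projector overlap and distortion):

```python
import math

# Total solid angle of a sphere in square degrees (~41,253)
SPHERE_SQ_DEG = 129600 / math.pi

def pixels_for_resolution(lp_per_deg):
    """Pixels needed to sustain a line-pair density everywhere on the sphere.

    One line pair needs two pixels, so the required pixel density is
    (2 * lp/deg)^2 per square degree of the sphere.
    """
    px_per_sq_deg = (2 * lp_per_deg) ** 2
    return SPHERE_SQ_DEG * px_per_sq_deg

eye_limited = pixels_for_resolution(30)  # ~1.5e8, matching "approx. 150M"
dome_spec = pixels_for_resolution(11)    # ~2e7, matching "20M pixels"
```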

Page 41: The Allosphere


• Design requirements relate to all aspects of system design

– Projector side

• Best image quality, usually combined with color correction

• Limited configuration

• Lower cost and higher flexibility

– Dedicated hardware

• Lower latencies

• DLP projectors are problematic due to the extra frame-buffer latency

– Custom Software Infrastructure?

Image Warping and Blending

Page 42: The Allosphere

An Engineering Challenge

The Audio subsystem

Page 43: The Allosphere

Audio Requirements

• “Ear-limited” audio rendering

– Flat frequency response, 20 Hz – 22 kHz

– Dynamic range: 120 dB

– SNR > 90 dB

– T60 < 0.75 sec

– Spatial accuracy: 3° in the horizontal plane and 10° in elevation

Page 44: The Allosphere

Spatial Audio

• Examples of Spatial audio: stereo, surround...

• Geometrical model-based spatialization

– Mono source + dynamic positioning

– Three “standard” techniques:

• Vector-based amplitude panning

• Ambisonic spatialization

• Wavefield Synthesis
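Of the three techniques listed, vector-base amplitude panning is the simplest to sketch. Below is a minimal two-dimensional (speaker-pair) version: it solves the gain equation g₁l₁ + g₂l₂ = p for unit direction vectors, then power-normalizes the gains. This is an illustrative reduction; a spherical instrument would use the three-dimensional, speaker-triplet form of the same idea.

```python
import math

def vbap_2d(source_deg, spk1_deg, spk2_deg):
    """Gains for a source panned between two speakers (2-D VBAP).

    Solves g1*l1 + g2*l2 = p for unit vectors l1, l2 (speakers) and
    p (source direction), then applies constant-power normalization.
    """
    def unit(deg):
        r = math.radians(deg)
        return (math.cos(r), math.sin(r))

    p, l1, l2 = unit(source_deg), unit(spk1_deg), unit(spk2_deg)
    det = l1[0] * l2[1] - l1[1] * l2[0]       # 2x2 determinant
    g1 = (p[0] * l2[1] - p[1] * l2[0]) / det  # Cramer's rule
    g2 = (l1[0] * p[1] - l1[1] * p[0]) / det
    norm = math.hypot(g1, g2)                 # constant-power normalization
    return g1 / norm, g2 / norm

# A source halfway between speakers at +/-30 degrees gets equal gains
g1, g2 = vbap_2d(0, 30, -30)
```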

Page 45: The Allosphere

Wavefield Synthesis

• Huygens' principle of superposition: many closely spaced speakers create a coherent wavefront with an arbitrary source position

• 3D WFS has still not been attempted because of computational complexity (3D Kirchhoff–Helmholtz integral): Ambisonics can be used on the z axis
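The suggestion of using Ambisonics for the vertical axis can be illustrated with first-order B-format encoding, which maps a mono signal plus a direction onto four channels (W, X, Y, Z). A sketch using the conventional 1/√2 weight on the omnidirectional W channel (this is the textbook encoding, not necessarily the exact convention the AlloSphere framework adopts):

```python
import math

def encode_bformat(sample, azimuth_deg, elevation_deg):
    """First-order Ambisonic (B-format) encoding of one mono sample.

    W is omnidirectional (conventional 1/sqrt(2) weight); X, Y, Z are the
    figure-of-eight components along the three axes.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample / math.sqrt(2)
    x = sample * math.cos(az) * math.cos(el)
    y = sample * math.sin(az) * math.cos(el)
    z = sample * math.sin(el)               # the height channel
    return w, x, y, z

# A source directly to the left (azimuth 90 deg, no elevation)
w, x, y, z = encode_bformat(1.0, 90, 0)
```

The appeal for a sphere is that the Z channel carries height directly, so elevation comes "for free" from the same encoding that handles azimuth.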

Page 46: The Allosphere

Spatial Techniques

• All these techniques present pros/cons and interesting research problems

– We already have a framework that can effectively combine them

• Spatial audio: huge success in the near future

• Number of speakers depends on the specific technique

– but in order to have a reasonable spatial resolution we need ~500 speakers

• The technology to use (electrostats, ribbons, tweeter arrays...) is also still under discussion.

Page 47: The Allosphere

Input Sensing & Multimodal HCI

Page 48: The Allosphere

Interactivity

• Dynamic, user-driven environment – how to best give users the ability to interact with data in effective, compelling, and natural ways?

• Powerful techniques for navigation, selection, manipulation, and signaling

• Sense and perceive human movement, gesture, and speech via a network of sensors

– Cameras, microphones, haptic devices, etc.

– Multimodal interaction!

Page 49: The Allosphere

Computing Infrastructure

Page 50: The Allosphere

Integration

• A typical multi-modal AlloSphere application will integrate services running on multiple hosts on the LAN that implement a distributed system composed of:

– input sensing (camera, sensor, microphone),

– gesture recognition/control mapping,

– interface to a remote (scientific, numerical, simulation, data mining) application,

– back-end processing (data/content accessing),

– A/V rendering and projection management.
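The loose coupling between those services is easy to picture as small messages passed between hosts on the LAN. A toy sketch of one hop, sensing service to control mapping, over UDP loopback; the JSON framing, field names, and port choice are all illustrative assumptions, since the slides do not specify the actual wire protocol:

```python
import json
import socket

# Consumer: e.g. the gesture-recognition / control-mapping service
consumer = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
consumer.bind(("127.0.0.1", 0))   # let the OS pick a free port
consumer.settimeout(2.0)
addr = consumer.getsockname()

# Producer: e.g. the camera-tracking service publishing a user position.
# Field names here are hypothetical, for illustration only.
producer = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
update = {"service": "camera-tracker", "user": 3, "pos": [1.2, 0.4, 2.0]}
producer.sendto(json.dumps(update).encode(), addr)

# The consumer decodes the update and would feed it to the renderer
data, _ = consumer.recvfrom(4096)
decoded = json.loads(data)
```

The design point is that each stage (sensing, mapping, back-end, rendering) can live on its own host and be swapped independently, as long as the message contract holds.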

Page 51: The Allosphere

Integration

• We still need software infrastructure to distribute the different graphics pipes from the generation engine to the render farm

• Develop an ad-hoc visual generation software engine and its interconnection with data streams.

• Effort needs to go into building this intermediate integration/coordination layer by combining several specialized packages

– Cyberinfrastructure grant submitted (Hollerer, Wolski and Shea)

Page 52: The Allosphere

Video Generation Subsystem

• In order to generate high-resolution (1920x1200 @ 120 Hz) active stereo, we need high-end video cards

• Sample rendering farm for 14 stereo channels: 7 servers, each with one Quadro FX5600.

• Blending and warping managed mostly on the projector side.

Sample video generation unit: a Linux render box with an NVIDIA Quadro 5600 (still to appear) feeding two Christie Mirage S2K+ projectors

Page 53: The Allosphere

Video Distribution

Page 54: The Allosphere

Video Distribution

Page 55: The Allosphere

Audio Generation Subsystem

• Problem: distribute 500+ channels of hi-fi audio to the speakers

1. Distributed rendering: ~1.3 Gbps (at 24 bit/96 kHz)

• Multichannel audio streaming over the network: Yamaha's mLAN, Gibson's Global Information Carrier, Sony's SuperMAC...

• Sample-synchronous output: Steve Butner's EtherSync

• Network interface box to be custom built

2. Single render point

• Develop custom DSP hardware

• Harder signal distribution
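The ~1.3 Gbps figure is essentially raw PCM arithmetic. A sketch of the calculation; the 1.15 overhead factor is an assumption added here to bridge raw sample data to the slide's figure (the slides do not state what framing they budgeted for):

```python
def audio_bitrate_gbps(channels, bit_depth, sample_rate_hz, overhead=1.0):
    """Bandwidth (Gbps) to stream uncompressed PCM speaker feeds.

    overhead > 1.0 models packet framing/headers; 1.0 means raw samples.
    """
    return channels * bit_depth * sample_rate_hz * overhead / 1e9

raw = audio_bitrate_gbps(500, 24, 96000)            # 1.152 Gbps of raw PCM
framed = audio_bitrate_gbps(500, 24, 96000, 1.15)   # ~1.3 Gbps with assumed overhead
```

Either way the load comfortably exceeds gigabit Ethernet, which is why the slide weighs specialized transports (mLAN, SuperMAC, EtherSync) and custom interface hardware.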

Page 56: The Allosphere

Audio Generation Subsystem

Page 57: The Allosphere

Audio Generation Subsystem

Synthesis/Processing Software: the audio team has extensive experience in developing such software and has ready-to-use frameworks such as CLAM (Amatriain; ACM MM Best Open Source Software 2006) and CSL (Pope)

Page 58: The Allosphere

Open Research Areas and People

• Graphics (Hollerer)

• Audio (Amatriain)

– Auralization (Roads)

– 3D Audio (Pope)

• Systems (Hollerer, Brewer, Butner, Pope, Amatriain)

• Interactivity (Turk, Kuchera-Morin, Amatriain)

• Experiential Signal Processing (Gibson)

• HPC, Optimization (Wolski, Krintz)

• Content Creation

– Visual (Legrady)

– Music (Kuchera-Morin)

– VW (Novak)

Page 59: The Allosphere

Open Research Areas and People

• Nanoscale systems representation (Oster, Garcia-Cervera)

• Brain Imaging (Grafton)

• Molecular Dynamics (Shea)

• GIS (Clarke et al.)

• Bio-imaging (Fisher, Manjunath)

• Perception (Loomis, Beall...)

Page 60: The Allosphere

http://www.mat.ucsb.edu/allosphere

THANKS!