
Enhanced Rendering of Fluid Field Data Using Sonification and Visualization

Maryia Kazakevich

May 10, 2007

2

Multi-Modal Project Overview

• Input:
  – Fluid field with velocity vector, pressure, plus potentially density, temperature, and other data
  – Changes with time
• Output:
  – Visualization of the given fluid field
  – Sound characterizing the given fluid field
    • Ambient: global to the whole field
    • Local: at the point or area of interaction
    • Local region: particles of a specific subset area around the pointer contribute to the sound (see the data sketch below)
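A minimal sketch of the kind of per-time-step data the renderers might consume, plus a "local region" query around the virtual pointer. The names (FieldSample, local_region) and the NumPy layout are illustrative assumptions, not the project's actual data structures.

```python
# Illustrative only: hypothetical names and layout, not the actual project API.
import numpy as np
from dataclasses import dataclass

@dataclass
class FieldSample:
    """One time step of the solution: per-vertex positions and field values."""
    positions: np.ndarray   # (N, 3) vertex coordinates
    velocity: np.ndarray    # (N, 3) velocity vectors
    pressure: np.ndarray    # (N,)   scalar pressure
    time: float             # simulation time of this sample

def local_region(sample: FieldSample, pointer: np.ndarray, radius: float) -> np.ndarray:
    """Indices of vertices within `radius` of the virtual pointer -- the particles
    that would contribute to the 'local region' sound."""
    d = np.linalg.norm(sample.positions - pointer, axis=1)
    return np.nonzero(d <= radius)[0]
```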

3

Structure

• Each rendering program is independent of any other (a minimal process sketch follows the diagram summary below)

[Architecture diagram: Solution Data Server; Max/MSP Program with the Main Program as a Max/MSP object; Visualization Program; Haptic Program; outputs: Sound, Image, Haptic Device]
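A hypothetical sketch of that decoupling, assuming each rendering program simply polls the solution data server in its own loop. The class and function names are invented for illustration, and Python threads stand in for what are in fact separate programs (visualization, the Max/MSP object, haptics).

```python
# Sketch only: threads stand in for the independent rendering programs.
import random
import threading
import time

class SolutionDataServer:
    """Stand-in for the solution data server; returns the latest field value."""
    def sample_at_pointer(self) -> float:
        return random.random()  # placeholder for real solver output

def renderer_loop(name: str, server: SolutionDataServer, render) -> None:
    # Each rendering program polls the server independently of the others.
    for _ in range(3):
        render(name, server.sample_at_pointer())
        time.sleep(0.1)

if __name__ == "__main__":
    server = SolutionDataServer()
    renderers = [
        ("visualization", lambda n, v: print(f"{n}: drawing value {v:.2f}")),
        ("sonification",  lambda n, v: print(f"{n}: sending {v:.2f} to the audio patch")),
        ("haptics",       lambda n, v: print(f"{n}: force feedback for {v:.2f}")),
    ]
    threads = [threading.Thread(target=renderer_loop, args=(n, server, r))
               for n, r in renderers]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```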

4

Sonification Types

• Contrast vs. Inverted Amplitude Modulation
  – the velocity value is mapped to either an increase or a decrease in the amplitude of the sound
• Amplitude vs. Frequency Modulation
  – the highest velocity value is mapped to either the loudest or the highest-pitched noise
• Before vs. after interpolation
  – many separate sounds, one for each vertex in the local area, vs. one sound of the interpolated value at the position of the virtual pointer (see the mapping sketch below)
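The three mapping choices can be written down in a few lines. This is only a hedged sketch: the normalization constant, frequency range, and the inverse-distance interpolation are assumptions, not the parameters used in the actual system.

```python
# Illustrative velocity-to-sound mappings; constants are assumed, not from the study.
import numpy as np

V_MAX = 1.0  # assumed maximum velocity magnitude used for normalization

def amplitude_contrast(v_mag: float) -> float:
    """Contrast AM: higher velocity -> louder sound (amplitude in [0, 1])."""
    return min(v_mag / V_MAX, 1.0)

def amplitude_inverted(v_mag: float) -> float:
    """Inverted AM: higher velocity -> quieter sound."""
    return 1.0 - min(v_mag / V_MAX, 1.0)

def frequency_mapping(v_mag: float, f_low: float = 200.0, f_high: float = 2000.0) -> float:
    """Frequency modulation: the highest velocity maps to the highest pitch (Hz)."""
    return f_low + min(v_mag / V_MAX, 1.0) * (f_high - f_low)

def interpolated_velocity(pointer: np.ndarray, positions: np.ndarray,
                          velocities: np.ndarray) -> np.ndarray:
    """'After interpolation': one value at the pointer (inverse-distance weighting)
    instead of a separate sound per vertex of the local area."""
    d = np.linalg.norm(positions - pointer, axis=1)
    w = 1.0 / np.maximum(d, 1e-9)
    return (w[:, None] * velocities).sum(axis=0) / w.sum()
```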

5

Usability Study Setup

• Focus on the usability study
  – Compare Visual, Audio, and Multi-Modal interfaces
  – Influence of the audio setup
  – Simple rendering
    • complex is not always better, and simple rendering allows real-time calculations
    • learning process
• Overall and single trials have to be fairly short
  – fatigue
  – sound sensitivity
• Many short trials
  – Large nodes in a small field

6

Usability Study Setup

Visual and/or audio cues; the haptic device is used for navigation

7

Usability Study Setup

• Trials for each person:

• 3 groups of people:
  – frequency modulation
  – positive amplitude modulation (contrasted sound)
  – negative amplitude modulation (inverted sound)

8

Usability Study Results

• Worst results for the audio-only system:
  – participants are slower in locating the goal
  – participants are less efficient in exploring the volume
  – participants are less precise in locating the goal

9

Usability Study Results

Specific system setup might help to improve performance:

10

Usability Study Results

• Multi-modal vs. Visual-only interface:
• Equal or better results for the multi-modal system
  – participants explore less space
  – participants are much faster in locating the goal position

11

Usability Study Results

Specific system setup helps to improve performance

12

Usability Study Results

In a specific audio setup of the multi-modal system, participants are more precise in locating the goal:

13

Usability Study Conclusion

• The multi-modal system is more powerful than either a pure visual or a pure audio system
• Specific mapping parameters influence system performance
• Different audio parameters are better for the audio-only interface than for the multi-modal interface
• Different audio parameters might be better for different conditions

14

Further work

• More sophisticated rendering algorithms
  – for visualization
  – for sonification
  – for haptic representation of the data
• Sonifying along pathlines, streaklines, streamlines, streamribbons, and streamtubes (see the streamline sketch below)
• Sonifying other data parameters
• Fully surrounding sound
• Self-adjusting system
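As one way to read the streamline item, here is an illustrative sketch that traces a streamline through a steady velocity field with forward-Euler steps; the speed sampled along the path could then be fed into a velocity-to-sound mapping like the ones sketched under "Sonification Types". The function names and the toy field are assumptions, not part of the presented work.

```python
# Illustrative streamline tracing; names and the toy field are assumptions.
import numpy as np

def trace_streamline(seed, velocity_at, step=0.01, n_steps=200):
    """Forward-Euler streamline integration; velocity_at(p) returns a 3-vector."""
    points = [np.asarray(seed, dtype=float)]
    for _ in range(n_steps):
        v = np.asarray(velocity_at(points[-1]), dtype=float)
        speed = np.linalg.norm(v)
        if speed < 1e-12:
            break  # stagnation point: the streamline ends here
        points.append(points[-1] + step * v / speed)
    return np.array(points)

if __name__ == "__main__":
    swirl = lambda p: np.array([-p[1], p[0], 0.1])            # toy velocity field
    path = trace_streamline([1.0, 0.0, 0.0], swirl)
    speeds = [float(np.linalg.norm(swirl(p))) for p in path]  # values to sonify
```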

The End
