
The Schepens Eye Research Institute
An Affiliate of Harvard Medical School

The Effect of Edge Filtering on Vision Multiplexing

Henry L. Apfelbaum, Doris H. Apfelbaum, Russell L. Woods, Eli Peli

SID 2005, Paper 41-2
May 23, 2005, Boston, MA


Motivation

• Our lab is developing devices to help people with low vision
  – Central field loss (e.g., macular degeneration)
  – Peripheral vision loss ("tunnel vision") [video: tunnel vision]

• Our devices employ vision multiplexing
  – Two different views presented to one or both eyes simultaneously [video: vision multiplexing with a HUD]
  – For tunnel vision, we have spectacles with a see-through minifying display
  – We edge-filter the display to emphasize detail needed for orientation and navigation [video: see-through HMD]

See-through minifying HMD

[Figure: the see-through minifying HMD combines a camera, a display, and a beam-splitter; a sketch of the minification follows]
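The slides do not give the HMD's optical parameters, so the sketch below only illustrates the minification idea: the camera's wide field of view is scaled down so it fits within the wearer's residual (tunnel) field, and the edge-filtered result is optically superimposed on the natural see-through view. The 60° camera field, 10° residual field, and nearest-neighbor downscaling are illustrative assumptions, not the device's actual values.

```python
import numpy as np

def minification_factor(camera_fov_deg: float, residual_fov_deg: float) -> float:
    """Scale factor needed to squeeze the camera's field into the residual field."""
    return residual_fov_deg / camera_fov_deg

def minify(frame: np.ndarray, factor: float) -> np.ndarray:
    """Downscale a grayscale frame by `factor` using nearest-neighbor sampling."""
    h, w = frame.shape
    new_h, new_w = max(1, int(h * factor)), max(1, int(w * factor))
    rows = (np.arange(new_h) / factor).astype(int)
    cols = (np.arange(new_w) / factor).astype(int)
    return frame[rows[:, None], cols[None, :]]

# Illustrative numbers only: a 60-degree camera view shown inside a 10-degree residual field.
factor = minification_factor(camera_fov_deg=60.0, residual_fov_deg=10.0)   # ~0.17
frame = np.random.default_rng(0).integers(0, 256, size=(480, 640)).astype(np.uint8)
small = minify(frame, factor)
print(small.shape)   # roughly (80, 106): the whole scene now spans the residual field
```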

Motivation

• Can the brain handle it?

Neisser & Becklen experiment (1975)
[Video: count the slap attempts. Did you see her?]

• Inattentional blindness:
  – Failure to notice significant events in one scene while attention is focused on another scene

• Hypothesis: Edge filtering can mitigate inattentional blindness

Our experiment

• We reproduced the Neisser and Becklen experiment, introducing edge filtering to see if unexpected events would be noticed more readily

• 4 attended/unattended scene filtering combinations (demo clips; an illustrative edge-filter sketch follows the list):
  – Full video over full video
  – Filtered ballgame over full handgame: bipolar edges
  – Filtered ballgame over full handgame: white edges
  – DigiVision edge filter output
  – Filtered handgame over full ballgame
  – Both games edge-filtered
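The filtered clips were produced with a DigiVision real-time edge-filtering processor, which is not reproduced here. As a hedged software approximation of the two renderings named above, the sketch below uses a difference-of-Gaussians response: "bipolar" edges keep both the bright and dark lobes of the response against a mid-gray background, while "white" edges keep only a thresholded magnitude (on an optical see-through display only the bright lines add light). The filter choice, sigma values, gain, and threshold are illustrative assumptions, not the DigiVision parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_response(frame: np.ndarray, sigma_narrow: float = 1.0, sigma_wide: float = 2.0) -> np.ndarray:
    """Difference-of-Gaussians band-pass response (positive on one side of an edge, negative on the other)."""
    f = frame.astype(float)
    return gaussian_filter(f, sigma_narrow) - gaussian_filter(f, sigma_wide)

def bipolar_edges(frame: np.ndarray, gain: float = 4.0) -> np.ndarray:
    """Render edges with both polarities: bright and dark lines on a mid-gray background."""
    r = dog_response(frame)
    return np.clip(128.0 + gain * r, 0, 255).astype(np.uint8)

def white_edges(frame: np.ndarray, threshold: float = 8.0) -> np.ndarray:
    """Render edges as white-only lines on black."""
    r = np.abs(dog_response(frame))
    return np.where(r > threshold, 255, 0).astype(np.uint8)

# Toy frame: a bright square on a dark background stands in for a video frame.
frame = np.zeros((120, 160), dtype=np.uint8)
frame[40:80, 60:110] = 200
print(bipolar_edges(frame).min(), bipolar_edges(frame).max())  # dark and bright edge lobes around 128
print(white_edges(frame).max())                                # 255 along the square's outline
```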

• 6 unexpected event scenes:
  – Juggler, Lost ball, Umbrella woman, Choose-up, Handshake, Ball toss

Trials

• 36 subjects
• 4 practice trials
• 8 scored trials
  – Each game attended in half of the trials
  – 6 trials showed the 6 unexpected events
  – 2 trials had no unexpected event
  – All 4 filtering treatments used with each game
  – Edge/edge combination used for the trials without unexpected events
  – Treatment/unexpected-event pairings and presentation order were balanced across subjects (an illustrative counterbalancing sketch follows)
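The slides do not describe the balancing procedure itself; the sketch below is only a generic illustration of how the 6 unexpected-event scenes could be rotated through presentation positions and through the three event-bearing filtering treatments so that, across 36 subjects, every pairing occurs equally often. The rotation scheme and labels are assumptions for illustration, not the authors' actual design, and the two no-event trials are omitted.

```python
from collections import Counter

# Illustrative labels only (not taken from the slides).
events = ["Juggler", "Lost ball", "Umbrella woman", "Choose-up", "Handshake", "Ball toss"]
treatments = ["full/full", "full/edge", "edge/full"]   # attended/unattended filtering on event trials

def schedule_for_subject(subject: int) -> list[tuple[str, str]]:
    """Rotate presentation order by subject, and event-treatment pairing by subject block."""
    rotate = subject % 6                 # which cyclic presentation order this subject gets
    offset = (subject // 6) % 3          # which event-treatment pairing this block of subjects gets
    order = events[rotate:] + events[:rotate]
    return [(ev, treatments[(events.index(ev) + offset) % 3]) for ev in order]

# Across 36 subjects, every event meets every treatment equally often (12 times),
# and each event appears in every presentation position equally often.
counts = Counter(pair for s in range(36) for pair in schedule_for_subject(s))
print(min(counts.values()), max(counts.values()))   # -> 12 12
```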

Trials (cont'd)

• Subject clicked a mouse at each ball toss or hand-slap attempt in the attended game

• Questions asked after each trial:
  – How difficult was that?
  – Any particularly hard parts?
  – Anything in the background that distracted you or interfered with the task?

• We scored (a scoring sketch follows):
  – Number of unexpected events detected
  – Hit rate (a mouse click close to an attended event)
  – Average response time for attended-event "hits"
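The slides do not define "close to" numerically, so the matching window below is an assumption. The sketch pairs each recorded mouse-click time with the nearest following event time in the attended game; clicks within the window count as hits (and yield a response time), unmatched events are misses, and leftover clicks are false alarms. Variable names and the 1-second window are illustrative only.

```python
def score_trial(event_times: list[float], click_times: list[float], window: float = 1.0):
    """Match clicks to attended-game events and return hit rate, false alarms, and mean RT (seconds)."""
    events = sorted(event_times)
    clicks = sorted(click_times)
    used = [False] * len(clicks)
    response_times = []

    for ev in events:
        # Find the earliest unused click that lands within `window` seconds after the event.
        candidates = [i for i, c in enumerate(clicks)
                      if not used[i] and 0.0 <= c - ev <= window]
        if candidates:
            i = candidates[0]
            used[i] = True
            response_times.append(clicks[i] - ev)

    hits = len(response_times)
    misses = len(events) - hits
    false_alarms = used.count(False)
    mean_rt = sum(response_times) / hits if hits else None
    return {"hit_rate": hits / len(events) if events else None,
            "miss_rate": misses / len(events) if events else None,
            "false_alarms": false_alarms,
            "mean_rt_s": mean_rt}

# Toy example: three ball tosses, the second one missed, plus one spurious click.
print(score_trial(event_times=[2.0, 5.0, 9.0], click_times=[2.5, 9.4, 11.0]))
# -> hit_rate 0.67, 1 false alarm, mean RT ~0.45 s
```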

Results: Unexpected event detections

[Histogram: number of subjects (0 to 10) by number of events detected (0 to 6)]

• 57% of the 216 unexpected events presented were detected
• Only 2 subjects detected all 6 events shown
• One subject detected none

Results: Unexpected event detections

Detections by event and filtering treatment (attended/unattended):

Event         Full/Full   Full/Edge   Edge/Full   Total
Ball toss         10          10          11        31
Choose-up          9          10           9        28
Juggler           10           7          10        27
Umbrella           8           8           3        19
Handshake          4           3           4        11
Lost ball          3           2           2         7
Total             44          40          39       123

Edge filtering was not significant (p = 0.67)
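The slides report the significance result without naming the test. For illustration only, a simple chi-square test of independence on the detected/missed counts pooled over the three event-bearing treatments (72 presentations each) gives a p-value close to the one reported; this should not be read as the authors' actual analysis, which may have treated subjects and events differently.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Pooled counts from the table above: 72 event presentations per treatment column.
detected = np.array([44, 40, 39])        # attended/unattended: Full/Full, Full/Edge, Edge/Full
missed = 72 - detected                   # 28, 32, 33

chi2, p, dof, expected = chi2_contingency(np.vstack([detected, missed]))
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.2f}")   # p comes out near the reported 0.67
```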

Results: Attended task accuracy

• Hit rates were high
  – 95.2% ballgame hit accuracy
  – 98.2% handgame hit accuracy
  – No significant effect of cartooning or unexpected events

• Hit response times
  – Event scene had no significant effect (p > 0.65)
  – Filtering the unattended task had no significant effect (p = 0.37)
  – Filtering the attended task had a significant but small impact (527 vs 498 ms, p < 0.001)

Conclusions

• Good news: Edge filtering did not materially affect performance of the attended task
  – We know that the relative ease with which salient features can be found in an edge-filtered view aids orientation and navigation
  – Edge filtering also seems to make it easier to distinguish the views

• Surprising news: Edge filtering did not aid (or hinder) the detection of unexpected events

Future

• We plan to test subjects with tunnel vision (who need to scan to view the full scene)

• Some events are much more detectable than others, so we hope to learn more about just what affects detectability

• The context provided when one scene is viewed at two scales (as in our HMD, rather than as two different scenes) may affect detectability

• Bipolar edges are clearly better than white-only edges; an all-video HMD could afford that advantage

Acknowledgements

• Ulrich Neisser
• Miguel A. Garcia-Pérez
• Elisabeth M. Fine
• The Levinthal-Sidman JCC
• The JCC athletic staff
• James Barabas
• Ben Peli
• Aaron Mandel
• Chas Simmons

• Supported in part by NIH grant EY 12890 and DOD grant W81XWH-04-1-0892

THANK YOU! QUESTIONS?

http://www.eri.harvard.edu/faculty/peli/index.html

Results: Attended task hit rates

             Hits     Misses   False alarms
Ballgame     95.2%     4.8%        5.2%
Handgame     98.2%     1.8%        3.0%

Results: Attended task response times

Attended scene              Unattended scene (not significant, p = 0.37)
(significant, p < 0.001)        Full               Edges
Edges                       532 ms (±84)       522 ms (±96)
Full                        500 ms (±97)       496 ms (±100)
