
FROM INTERACTION TO EMPATHY: NEW DIRECTIONS IN INTERFACE TECHNOLOGY

Mark Billinghurst [email protected]

April 14th 2016

CHIuXiD 2016 Conference Jakarta, Indonesia

How did that work? How could I get a whole room of people clapping together with no instruction?

• Clear goal
• Simple feedback
• Well-connected network
• Everyone understood each other

Successful crowd-sourced behaviour

Computing Evolution

• Change in interface over time -> new challenges for design

Interface Design for the Future

• Key topics:
  • Feedback
  • Connected Networks
  • Shared Understanding

Single user -> Connected communities

TRENDS IN TECHNOLOGY

[Chart: interaction technology trending toward natural over time: punch card (1950), keyboard (1960), mouse (1980), speech (1990), gesture (2000), emotion (2010), and thought next]

Physiological Sensing

• Emotiv, Empatica

[Same chart, annotated: interaction shifts from explicit input (punch card, keyboard, mouse) toward implicit input (emotion, thought)]

[Chart: content capture realism over time: photo (1850), film (1900), live video (1940), panorama (1990), 360 video (2000), 3D space (2010)]

3D Image/Space Capture

• Google Project Tango, Samsung Project Beyond

[Same chart, annotated: capture evolves from 2D static images toward live, immersive experiences]

[Chart: networking speeds on a log scale, from 100 b/s (1980) to 100 Mb/s (late 2000s)]

Network Innovation

Universal Connectivity

[Same chart, annotated: rising bandwidth enables richer media: text, then audio, then video, approaching natural communication]

Holoportation

• Augmented Reality + 3D capture + high bandwidth (see the rough estimate below)
• http://research.microsoft.com/en-us/projects/holoportation/
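Why high bandwidth is the gating factor: a live 3D capture of a person is far heavier than ordinary video. A rough back-of-envelope sketch in Python; every number here (point count, bytes per point, frame rate) is an illustrative assumption, not Microsoft's published figure.

# Rough raw-bandwidth estimate for streaming a live 3D point-cloud capture.
# All constants are illustrative assumptions, not Holoportation's actual numbers.
points_per_frame = 300_000   # assumed dense capture of one person
bytes_per_point  = 9         # assumed: 3 x 16-bit position + 3 x 8-bit colour
frames_per_sec   = 30        # assumed capture rate

raw_mbps = points_per_frame * bytes_per_point * frames_per_sec * 8 / 1e6
print(f"Raw stream: {raw_mbps:.0f} Mb/s")  # ~648 Mb/s before compression

Even with aggressive compression, a stream like this sits far above typical consumer video rates, which is why Holoportation lands at the top right of the networking chart.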

Holoportation Demo

https://www.youtube.com/watch?v=7d59O6cfaM0

Natural Collaboration

Implicit Understanding

Experience Capture


Empathic Computing

EMPATHIC COMPUTING

Empathy

“Seeing with the Eyes of another,

Listening with the Ears of another,

and Feeling with the Heart of another.”

Alfred Adler

Empathic Computing

1. Understanding: Systems that can understand your feelings and emotions

2. Experiencing: Systems that help you better experience the world of others

3. Sharing: Systems that help you better share the feelings of others

Understanding: Affective Computing

• Rosalind Picard, MIT Media Lab
• Systems that recognize emotion

Appliances That Make You Happy

• Jun Rekimoto, University of Tokyo / Sony CSL
• Smile detection + smart appliances

Happiness Counter Demo

https://vimeo.com/29169237

Experiencing: Virtual Reality

"Virtual reality offers a whole different medium to tell stories that really connect people and create an empathic connection."

Nonny de la Peña http://www.emblematicgroup.com/

Hunger

• Experience of the homeless waiting in a food line

https://www.youtube.com/watch?v=wvXPP_0Ofzc

CHILDHOOD

• Kenji Suzuki, University of Tsukuba
• What does it feel like to be a child?
• VR display + cameras moved to a child's viewpoint + hand restrictors

CHILDHOOD Demo

https://vimeo.com/128641932

Sharing

Can we develop systems that allow us to share what we are seeing, hearing and feeling with others?


Using AR/Wearables for Empathy

• Remove technology barriers
• Enhance communication
• Change perspective
• Share experiences
• Enhance interaction in the real world

Example: Google Glass

• Camera + Processing + Display + Connectivity

• Ego-Vision Collaboration (But with Fixed View)

Current Collaboration on Wearables

• First-person remote conferencing/hangouts
• Limitations: single POV, no spatial cues, no annotations, etc.

Social Panoramas (ISMAR 2014)

• Capture and share social spaces in real time
• Supports independent views into the panorama

Reichherzer, C., Nassani, A., & Billinghurst, M. (2014, September). [Poster] Social panoramas using wearable computers. In Proceedings of the 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 303-304). IEEE.

Implementation

• Google Glass: capture live image panorama (compass + camera)
• Remote device (tablet): immersive viewing, live annotation (a placement sketch follows)
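A minimal sketch of the compass-based placement idea: each captured frame is pasted into a 360-degree cylindrical canvas at the column given by its compass heading. The canvas size, field of view, and the crude crop standing in for a proper warp are all assumptions, not the paper's implementation.

import numpy as np

PANO_W, PANO_H = 3600, 600   # assumed canvas: one column per 0.1 degree
FRAME_FOV_DEG  = 54.0        # assumed horizontal field of view of the camera

def paste_frame(panorama, frame, heading_deg):
    """Paste a captured frame into the cylindrical panorama at the
    column given by the compass heading (0 = north), wrapping at 360."""
    frame_w = int(FRAME_FOV_DEG / 360.0 * PANO_W)
    patch = frame[:PANO_H, :frame_w]                      # crude crop, stand-in for a warp
    start = int((heading_deg % 360.0) / 360.0 * PANO_W)
    cols = (np.arange(patch.shape[1]) + start) % PANO_W   # wrap past 360 degrees
    panorama[:patch.shape[0], cols] = patch

pano = np.zeros((PANO_H, PANO_W, 3), dtype=np.uint8)
paste_frame(pano, np.full((600, 800, 3), 128, dtype=np.uint8), heading_deg=350.0)

The remote tablet can then render any sub-window of the shared canvas, which is what gives each viewer an independent view into the panorama.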

User Interfaces

Glass View

Tablet View

Social Panorama

https://www.youtube.com/watch?v=vdC0-UV3hmY

Lessons Learned

• Good
  • Communication easy and natural
  • Users enjoyed view independence
  • Sharing the panorama enhances the shared experience

• Bad
  • Difficult to support equal input
  • Need to provide better awareness cues

CoSense (CHI 2015)

• Real-time sharing: emotion, video, and audio
• Wearable (sends emotion) -> Desktop (remote view)

Google Glass + e-Health 2.0 board

Ayyagari, S. S., Gupta, K., Tait, M., & Billinghurst, M. (2015, April). Cosense: Creating shared emotional experiences. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (pp. 2007-2012). ACM.

Implementation

Data Capture -> Feature Detection -> Emotion Recognition -> Emotion Representation -> Empathic User Interface

Hardware

User Interface

Wearable Interface

• Google Glass + e-Health + Spydroid + SSI
• Measures GSR, pulse oximetry, ECG, voice pitch
• Shares video and audio remotely
• Representative emotions sent back to the Glass user (a toy pipeline sketch follows)
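A toy Python sketch of the capture -> features -> recognition -> representation chain. The features, thresholds, and labels are made-up stand-ins for the actual SSI-based recogniser; the point is only the shape of the pipeline.

from statistics import mean

def arousal_features(gsr_uS, pulse_bpm):
    """Reduce a short window of raw signals to a few simple features."""
    return {"gsr_delta": max(gsr_uS) - min(gsr_uS),
            "hr_mean": mean(pulse_bpm)}

def classify(feat):
    """Toy stand-in for the recogniser: rising skin conductance plus an
    elevated heart rate is read as high arousal. Thresholds are made up."""
    if feat["gsr_delta"] > 0.5 and feat["hr_mean"] > 90:
        return "excited"
    return "calm"

# One (hypothetical) five-second window of sensor readings.
label = classify(arousal_features([2.1, 2.4, 2.9], [88, 95, 102]))
print(label)  # -> excited; a representation of this label goes back to the Glass user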


Desktop Interface

CoSense Demo

Lessons Learned

• Good
  • System was wearable
  • Sender and receiver mirrored emotion
  • Minimal cues provided the best experience

• Bad
  • System delays
  • Need for a good stimulus
  • Difficult to represent emotion

Empathy Glasses (CHI 2016)

• Combines eye-tracking, display, and face expression sensing
• Implicit cues: eye gaze, face expression

Pupil Labs + Epson BT-200 + AffectiveWear

Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.

AffectiveWear – Emotion Glasses

• Photo sensors to recognize expression
• User calibration
• Machine learning
• Recognizes 8 face expressions (a toy calibrate-and-classify sketch follows)
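A toy sketch of the calibrate-then-classify idea: record one sensor vector per expression during calibration, then label new readings by nearest template. The nearest-neighbour matching, the eight-sensor vector size, and the label names are assumptions standing in for the paper's actual learned classifier.

import numpy as np

EXPRESSIONS = ["neutral", "smile", "laugh", "frown",
               "angry", "surprised", "sad", "wink"]   # hypothetical label set

def calibrate(samples):
    """Per-user calibration: one photo-sensor vector recorded while the
    wearer performs each expression, kept as matching templates."""
    labels = list(samples)
    return np.stack([samples[l] for l in labels]), labels

def classify(reading, templates, labels):
    """Nearest-template match (a stand-in for the learned classifier)."""
    return labels[int(np.argmin(np.linalg.norm(templates - reading, axis=1)))]

templates, labels = calibrate({e: np.random.rand(8) for e in EXPRESSIONS})
print(classify(np.random.rand(8), templates, labels))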

Integrated System • Local User

• Video camera • Eye-tracking • Face expression

• Remote Helper • Remote pointing

System Diagram

• Two monitors on the Remote User side: scene view + emotion display (a gaze-mapping sketch follows)
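For the gaze cue, the core step is mapping the eye tracker's normalised gaze estimate onto the scene-camera frame the remote helper sees. A minimal sketch, assuming Pupil Labs' convention of normalised coordinates with the origin at the bottom-left:

def gaze_to_scene_px(gaze_norm, scene_w, scene_h):
    """Map a normalised gaze estimate (0-1, origin bottom-left) onto
    pixel coordinates in the scene camera frame (origin top-left)."""
    x, y = gaze_norm
    x = min(max(x, 0.0), 1.0)   # clamp off-frame estimates
    y = min(max(y, 0.0), 1.0)
    return int(x * (scene_w - 1)), int((1.0 - y) * (scene_h - 1))  # flip y

print(gaze_to_scene_px((0.5, 0.5), 1280, 720))  # gaze cursor near frame centre: (639, 359)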

Empathy Glasses Demo

Lessons Learned

• Pointing really helps in remote collaboration
  • Makes the remote user feel more connected
• Gaze looks promising
  • Shows the context of what a person is talking about
• More work needed on emotion/expression cues
• Limitations
  • Limited implicit cues
  • Two separate displays
  • Task was a poor emotional trigger
  • AffectiveWear needs improvement

FUTURE RESEARCH

Looking to the Future

Scaling Up

• Seeing the actions of millions of users in the world
• Augmentation at city/country level

AR + Smart Sensors + Social Networks

• Track population at city scale (mobile networks)
• Match population data to external sensor data
• Mine data for applications (a toy join sketch follows)
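A toy sketch of the match-then-mine step: join per-cell population counts derived from the mobile network with an external sensor feed, then flag cells of interest. All field names and thresholds here are hypothetical.

# Toy join of mobile-network population counts with an external sensor feed.
population = {"cell_12": 5400, "cell_13": 800}   # people per city cell (mobile data)
noise_db   = {"cell_12": 78,   "cell_13": 55}    # sensor reading per cell

# Mine the joined data: flag cells that are both crowded and noisy.
hotspots = [c for c in population
            if population[c] > 5000 and noise_db.get(c, 0) > 70]
print(hotspots)  # -> ['cell_12']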

Example: MIT SENSEable City Lab

http://senseable.mit.edu/wikicity/rome/

Example: CSIRO WeFeel Tool

• Emotion-mining global Twitter feeds (a minimal sketch follows)

• http://wefeel.csiro.au
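A minimal lexicon-based sketch of what emotion-mining a tweet stream can look like; the word lists and matching rule are tiny illustrative stand-ins, not WeFeel's actual lexicon or method.

import re
from collections import Counter

# Tiny illustrative lexicon, not WeFeel's actual word lists.
LEXICON = {"joy": {"happy", "great", "love"},
           "sadness": {"sad", "miss", "lonely"},
           "anger": {"angry", "hate", "furious"}}

def emotion_counts(tweets):
    """Count tweets containing at least one cue word per emotion."""
    counts = Counter()
    for tweet in tweets:
        words = set(re.findall(r"[a-z']+", tweet.lower()))
        for emotion, cues in LEXICON.items():
            if words & cues:
                counts[emotion] += 1
    return counts

print(emotion_counts(["I love Jakarta, so happy!", "feeling lonely today"]))
# Counter({'joy': 1, 'sadness': 1})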

CONCLUSION

Conclusions

• Empathic Computing: sharing what you see, hear, and feel

• AR/wearables enable empathic experiences
  • Removing technology barriers
  • Changing perspective
  • Sharing space/experience

• Many directions for future research

www.empathiccomputing.org

@marknb00

[email protected]