PULSE: An Infrastructure for Collection of Audience Heartbeats for Music Visualization

Joe Geigel∗, Peizhao Hu∗, Ashita Khetan∗, Minseok Kwon∗, Susan Lakin†, Veronica Lin‡, Robert McCartney∗, Stephen Petoniak‡, Matthew Tidridge‡, and Katie Verrant‡

∗Department of Computer Science
†School of Photographic Arts and Sciences
‡School of Design
Rochester Institute of Technology, Rochester, NY 14623

E-mail: jmg1590|hxpvcs|ayk5245|mxkvcs|srlpph|vxl2455|rm7536|stp1389|mwt9438|[email protected]

Abstract—In this demo, we present an infrastructure for real-time, crowd-sourced heartbeat detection designed for use during a musical performance. Specially designed pulse sensors are used in conjunction with mobile devices to measure, transmit, and synchronize users’ individual and collective heartbeat data with procedurally generated visuals as they listen to a pulse-inducing musical selection.

I. INTRODUCTION

The use of pervasive computing in music [1], theatre [2], and live installations [3] has recently attracted a great deal of attention. Ubiquitous technologies have the potential to transform current entertainment paradigms, define new processes, and even create new forms of interactive entertainment [4].

Recent research has focused on both emotional expression using these technologies [5] and software architectures that make such forms of expression possible [6].

In this demo, we present an infrastructure designed for use during a live musical performance. The infrastructure is part of the 01X project:1 an interdisciplinary effort with the goal of using digital devices to reinvent the live concert experience, breaking down traditional barriers and creating an interactive visual and musical encounter between audience and performer.

Our framework addresses three recent trends in this area:

1) The use of mobile devices in a live performance setting.
2) The use of biometrics in guiding the creation, presentation, and audience experience of a live performance.
3) The use of visualizations that reflect the mood and emotion of the performance participants.

In this particular work, we focus on audience heartbeat data and present a framework for obtaining collective pulses from a group of people. This pulse data is then used to guide visualizations during a live performance. Our goal is the creation of an extendible, flexible, and effective platform that utilizes low-cost devices. Although designed with musical performance in mind, the framework is general enough to be utilized by any application requiring collective heartbeat data from a crowd.

1 http://www.project01x.com/

II. SYSTEM OVERVIEW

An overview of the infrastructure is illustrated in Fig. 1 below.

Fig. 1. Framework Architecture

The framework consists of the following integral components:

A. Pulse Sensor

Though current methods can detect heartbeats using the flashlight on a mobile device [7], such methods have been shown to be cumbersome for extended use by an audience member in a lengthy performance context [8]. Instead, we chose to design our own low-cost pulse sensor using an off-the-shelf photoplethysmograph-based sensor.2

This heartbeat detector is connected to a Bluetooth-enabled Arduino microcontroller.3 The components are assembled into an integrated unit designed to be worn on the index or ring finger (Fig. 2).

Pulse data collected by the device includes Beats per Minute (BPM) and Inter-Beat Interval (IBI).
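The two quantities are directly related: an IBI measured in milliseconds converts to BPM as 60000/IBI. A minimal sketch of this conversion (the function names are illustrative; the actual sensor firmware is not part of this paper):

```python
def bpm_from_ibi(ibi_ms: float) -> float:
    """Convert a single Inter-Beat Interval (milliseconds) to Beats per Minute."""
    return 60000.0 / ibi_ms

def smoothed_bpm(ibi_window_ms: list[float]) -> float:
    """Average the last few IBIs before converting, to damp beat-to-beat jitter."""
    return 60000.0 / (sum(ibi_window_ms) / len(ibi_window_ms))

# An IBI of 800 ms corresponds to 75 BPM.
print(bpm_from_ibi(800.0))                   # 75.0
print(smoothed_bpm([790.0, 800.0, 810.0]))   # 75.0
```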

B. Mobile App

Each pulse sensor is individually marked with a QR code and syncs to a specific mobile device over Bluetooth by scanning this code. A specially designed app receives the stream of heartbeat data from the PULSE sensor to which it is synced and relays this data to a global server.
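The paper does not specify the wire format between app and server; one plausible sketch of a per-beat upload message, with illustrative field names, might look like this:

```python
import json
import time

def make_reading(device_id: str, bpm: int, ibi_ms: int) -> str:
    """Package one heartbeat reading as JSON for upload to the pulse server.
    Field names are assumptions; the paper does not document the wire format."""
    return json.dumps({
        "device": device_id,   # ID scanned from the sensor's QR code
        "bpm": bpm,
        "ibi_ms": ibi_ms,
        "ts": time.time(),     # client timestamp, for server-side ordering
    })

msg = make_reading("sensor-017", 75, 800)
```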

2 http://pulsesensor.com
3 http://legacy.punchthrough.com/bean/

Fig. 2. The Pulse Sensing Device

Other non-biometric information can be attached to a device through the app and used as an additional control of the visualization. For example, in our prototype implementation, a user’s name and seat number in an auditorium are maintained by the app. In addition, an individual color based on the device QR code is created by the app to uniquely identify the user of a particular device in the visualization.
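One simple way to derive a stable per-device color from the QR-code ID is to hash the ID and reuse a few bytes of the digest as RGB channels. The paper does not give the actual scheme, so this is only a sketch:

```python
import hashlib

def device_color(device_id: str) -> str:
    """Derive a stable display color from a device's QR-code ID.
    Assumed scheme (not specified in the paper): hash the ID and use the
    first three digest bytes as an RGB hex color."""
    digest = hashlib.sha256(device_id.encode("utf-8")).digest()
    return "#{:02x}{:02x}{:02x}".format(digest[0], digest[1], digest[2])
```

Because the color is a pure function of the device ID, every component of the system (app, server, visualization) can recompute it without coordination.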

C. Pulse Server

The Pulse Server collects the pulse data from all participating mobile devices, stores it in a database, and makes the data available to any application via a RESTful web service. RabbitMQ is used to provide polling and pub/sub message delivery services.

An outline of the database schema is illustrated in Fig. 3 below. User-specific (pulse_user_profile) and device-specific (Device) data are stored, as well as the time-based detected heartbeat info (data). In addition, auxiliary information on the musical performance is also maintained (Event).

Fig. 3. Schema of data stored on the PULSE server
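The four tables named above can be sketched in SQL. Only the table names come from the paper; the column choices below are assumptions made for illustration:

```python
import sqlite3

# In-memory sketch of the schema in Fig. 3. Table names follow the paper;
# columns beyond those mentioned in the text are illustrative guesses.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE pulse_user_profile (
    user_id   INTEGER PRIMARY KEY,
    name      TEXT,
    seat      TEXT                      -- seat number entered in the mobile app
);
CREATE TABLE Device (
    device_id TEXT PRIMARY KEY,         -- ID encoded in the sensor's QR code
    user_id   INTEGER REFERENCES pulse_user_profile(user_id),
    color     TEXT                      -- display color derived from the QR code
);
CREATE TABLE data (
    device_id TEXT REFERENCES Device(device_id),
    ts        REAL,                     -- time of the detected beat
    bpm       INTEGER,
    ibi_ms    INTEGER
);
CREATE TABLE Event (
    event_id  INTEGER PRIMARY KEY,
    title     TEXT                      -- auxiliary info on the musical performance
);
""")
```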

D. Visualization Application

The visualization application grabs the pulse data from the pulse server database and uses this data to drive the procedural creation of visuals, potentially highlighting both individual pulses and the collective pulse of all participants.

In our prototype implementation, we use TouchDesigner,4 a visual development platform for creating real-time rich multimedia experiences. The tool uses a node-based dataflow model for producing visuals (Fig. 4). Using this tool, database tables from the PULSE server, accessible via a JSON-based web interface, can be defined as a data source. The stored heartbeat data can be combined with features extracted from the waveform data of the source audio to drive the parameters of nodes used to produce the visuals. Examples are presented in Section III.

4 http://www.derivative.ca/
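A consumer of the JSON interface might group the returned rows into per-device BPM series before feeding them to visualization nodes. A sketch with illustrative field names (the paper does not document the exact response format):

```python
from collections import defaultdict

def bpm_series(rows):
    """Group JSON rows from the pulse server into per-device BPM time series,
    ordered by timestamp. Row field names are assumptions for illustration."""
    series = defaultdict(list)
    for row in sorted(rows, key=lambda r: r["ts"]):
        series[row["device"]].append(row["bpm"])
    return dict(series)

rows = [
    {"device": "a", "ts": 2.0, "bpm": 78},
    {"device": "a", "ts": 1.0, "bpm": 76},
    {"device": "b", "ts": 1.5, "bpm": 90},
]
print(bpm_series(rows))  # {'a': [76, 78], 'b': [90]}
```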

Fig. 4. Dataflow defined in TouchDesigner used to create visualizations

III. SAMPLE VISUALIZATIONS

An initial prototype of the infrastructure was tested during the ImagineRIT festival in Rochester, NY in May of 2015.5

Individual performances were held hourly, six performances in total over the course of the day. During a single performance, as many as 25 users participated in the demonstration, each with their own pulse sensor. We describe some of the visualizations produced in this section.

A direct mapping visualization is illustrated by the Trails visualization (Fig. 5). This visualization presents a drawn trail for each user, similar to an EKG, and utilizes each user’s Heart Rate (HR) to drive a generated pulse signal presented in the visualization.

Fig. 5. The Trails Visualization
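One way to turn a scalar heart rate into a drawable pulse trail is to synthesize a periodic spike whose period matches the measured HR. The exact mapping used in Trails is not given in the paper, so the formula below is only illustrative:

```python
import math

def pulse_signal(hr_bpm: float, t: float) -> float:
    """Generate a simple EKG-like spike train from a heart rate.
    Illustrative mapping: a sharply exponentiated sinusoid whose period
    matches the heart rate, producing one narrow spike per beat."""
    period = 60.0 / hr_bpm                  # seconds per beat
    phase = (t % period) / period           # position in [0, 1) within a beat
    return math.sin(math.pi * phase) ** 8   # near 0 except around mid-beat
```

Sampling this function over time for each user yields a trail that beats at that user's measured rate.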

For the Dancing Lines visualization (Fig. 6), the user’s pulse affects the velocity of a 3D object, such as a line or cube, while the audio waveform drives the turbulence of the object’s motion. These lines are then fed through a feedback loop that “draws” them at each frame, creating a picture that is constantly building upon itself.

5 A video montage of photos from the demonstration can be found at https://www.youtube.com/watch?v=17tYpu8HF14

Fig. 6. The Dancing Lines Visualization

The Firework Sphere visualization (Fig. 7) also utilizes the input audio waveform along with the individual user HR to create a unique visual that is driven by both the music and the user data. The audio drives the shape and rotation of the sphere, while the HR drives the rate of the particles emitted from each user’s name on the vertices of the sphere.

Fig. 7. The Fireworks Sphere Visualization

For the last two visuals, the Beating Icon visualization (Fig. 8) and the Arrow Swarm visualization (Fig. 9), the seat number maintained by each audience member’s instance of the mobile app, along with their individual heart rate data, is used in constructing the generated graphics.

For the Beating Icon visualization, the user’s individual icon pulses and glows with every beat of his/her heart as read by the pulse monitor on his/her finger. With this visualization, users were able to use proximity to their neighbors to map out their own heart rate and compare it to those around them.

For the Arrow Swarm visualization, arrows are emitted from each user’s position at a rate driven by the measured heart rate. After being created, the arrow particles exhibit flocking behavior. Each user’s position in the audience is used as an attractor of varying strength based on that user’s heart rate; for example, a person with a higher overall heart rate acts as a stronger attractor. Because these variables are in constant fluctuation, each visual experience paints a different picture based on the user group.
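The mapping from heart rate to attractor strength can be sketched as a clamped linear scale. The range below is an assumption; the paper only states that a higher heart rate yields a stronger attractor:

```python
def attractor_strength(hr_bpm: float, lo: float = 50.0, hi: float = 120.0) -> float:
    """Map a user's heart rate to a flocking-attractor strength in [0, 1].
    The linear ramp and the lo/hi bounds are illustrative assumptions."""
    x = (hr_bpm - lo) / (hi - lo)
    return min(1.0, max(0.0, x))   # clamp so outliers stay in range
```

A flocking step would then weight each audience member's position by this strength when steering the arrow particles.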

Other data maintained by the app, such as the user’s name and assigned color, were utilized to further diversify the visuals and allow each user to stand out and be identifiable against

Fig. 8. The Beating Icon Visualization

Fig. 9. The Arrow Swarm Visualization

other participants while still contributing to the overall graphic experience.

IV. DEMONSTRATION

For the demonstration session, we will show the workings of each of the individual components of the infrastructure and describe how they are combined to construct the complete framework.

REFERENCES

[1] D. Keller, L. V. Flores, M. S. Pimenta, A. Capasso, and P. Tinajero, “Convergent trends toward ubiquitous music,” Journal of New Music Research, vol. 40, no. 3, pp. 265–276, 2011.

[2] C. B. Owen, A. Dobbins, and L. Rebenitsch, “Integrating the audience into a theatre performance using mobile devices,” International Journal of Pervasive Computing and Communications, vol. 10, no. 1, pp. 4–26, 2014.

[3] U. Ekman and M. Fuller, Throughout: Art and Culture Emerging with Ubiquitous Computing. MIT Press, 2013.

[4] J. Williamson, L. Hansen, G. Jacucci, A. Light, and S. Reeves, “Understanding performative interactions in public settings,” Personal and Ubiquitous Computing, vol. 18, no. 7, pp. 1545–1549, 2014.

[5] E. van den Broek, “Ubiquitous emotion-aware computing,” Personal and Ubiquitous Computing, vol. 17, no. 1, pp. 53–67, 2013.

[6] J. Burke, J. Friedman, E. Mendelowitz, H. Park, and M. B. Srivastava, “Embedding expression: Pervasive computing architecture for art and entertainment,” Pervasive and Mobile Computing, vol. 2, no. 1, pp. 1–36, 2006.

[7] P. Pelegris, K. Banitsas, T. Orbach, and K. Marias, “A novel method to detect heart beat rate using a mobile phone,” in Engineering in Medicine and Biology Society (EMBC), 2010 Annual International Conference of the IEEE, Aug 2010, pp. 5488–5491.

[8] Y.-Y. Fan and R. Weber, “Capturing audience experience via mobile biometrics,” in Proceedings of the 18th International Conference on Auditory Display, June 2012, pp. 214–217.