
Competing Attractions, Orbital Decay and the Music of the Spheres: Force-based Relational Dynamics for Organizing Space and Timbre in Performance Using Physical Models

Richard Graham
Sensory Computation/Experimental Narrative Environments Lab
Stevens Institute of Technology, USA
[email protected]
http://rickygraham.com

Brian Bridges
School of Creative Arts and Technologies
Ulster University, UK
[email protected]
http://brianbridges.net

Proceedings of the Korean Electro-Acoustic Music Society's 2016 Annual Conference (KEAMSAC2016), Seoul, Korea, 15-16 October 2016

This paper describes the mapping of embodied metaphors found within physical systems to the spatial organization of voices and timbral processes. The intention of such an approach is to enhance the clarity and richness of connections between performance gestures and sonic structures. Previous system iterations have presented mappings informed by ecological-embodied metaphors of dynamic forces as a means to bridge cross-domain performance events across physical and figurative planes. The first iteration sought to reify gravitationally based tonal pitch space models by mapping melodic syntax computations (such as attraction, tension, and inertia analogues) to in-kind parameters of a flocking algorithm as a method of dynamic audio spatialization. Given the embodied physical bases implied by musical models proposed by Lerdahl and Smalley, we present a system that further explores the ecological bases of musical abstraction through the lens of force-based mapping models for spatial audio and timbral processing. The present iteration of the system utilizes a physics engine in a game development environment as a base for a practice-led exploration of mappings encompassing a wider variety of force-relational dynamics (derived from instrumental note-events) as applied to the evolution of spatial and timbral gestures. A particular focus is the treatment of energy-motion trajectories within the system's mapping. While spatialization and diffusion are obvious output modalities for such a mapping, practice-led explorations of these embodied concepts, as facilitated by this system, may also inform a relational model of timbral connections.

The impetus for our live performance system design is largely based on the extension of an electric guitar practice through multichannel audio processing per instrumental register. Our chosen processes focus on the extraction of instrumental performance data, which are utilized to forge gestural narratives between recognizable physical or figurative performance events and timbral and spatial audio signal processing. This paper outlines the motivational basis, theoretical underpinnings, and current innovations of this project.

Metaphors and Mapping Strategies for Data-Rich Performance Environments

The initial motivation for this project was largely based on the desire to extend the sonic characteristics of the electric guitar beyond its conventional design. Previous iterations of our performance system have drawn influence from a series of fields, including embodied cognition and human-computer interaction. These systems incorporated metaphorical mappings based on familiar physical gestures to provide more intuitive access points for the listener and to allow the performer to manage complex sonic materials during a real-time instrumental performance. Emergent mapping strategies have included a melodic model for spatialization (Graham and Bridges 2014a, 2014b) and, more recently, a force-based model based on temporal and frequency-based continuity and coherence to drive spatial and timbral morphologies (Graham and Bridges 2015). This paper presents a series of recent system developments, including a physics and collision detection environment, driven by real-time physical and figurative performance gestures. These recent developments provide a useful framework for a composer/improviser to explore the notion of order to disorder (or a loose-tight continuum) in the composition of musical materials.

Emerging Live Performance System Designs for Multichannel Guitar

The basis of our new system is the Nu Series Modular Active Pickup by Cycfi Research in the Philippines. This low-power system boasts a hacker-friendly design, low-impedance coils, and a low-noise preamplifier for each module (DeGuzman 2016). These innovations in multichannel audio pickup design are ideal for the next stage of development for our system, particularly given the wide frequency response of each pickup module and the improved crosstalk performance through permalloy ring shields and discrete preamp design. The guitar passes the multiple individuated audio signals using a 19-pin LEMO connector cable, which also carries power to the pickup system. Overall, the system carries nine channels of audio: eight individual audio channels, one per register, and one channel carrying a sum of the existing mono magnetic pickups positioned at the neck and bridge of the guitar. As with our previous iterations, the audio feeds may then be sent to any analog or digital audio processing device using our available breakout board designs (Graham and Harding 2015).

Figure 1. Current Multichannel Guitar System Design incorporating Nu Modular, LEMO, Pure Data, and Unity.

This version of our system uses a basic circuit design by Graham (2016) intended for compatibility with analog synthesizer modules. Audio channels are then fed to a multichannel audio interface to capture pitch, amplitude, and spectral flux data per instrumental string register. Data may then be organized into higher-level tonal abstractions, scaled, and mapped to establish new gestural narratives between the performer, performance system, and resulting sonic structures. In previous iterations, the system design primarily utilized the Pure Data (Pd) visual programming environment for spatialization and timbral processing. This iteration will seek to develop a bi-directional and multidimensional relationship between the feature extraction and effects processing patches programmed in Pd and a more sophisticated physics and collision detection system found within the Unity game development platform.

An Introduction to VESPERS at SCENE Lab

In 2016, Graham co-founded a research facility at Stevens Institute of Technology (USA) with Messrs. Cluett, Manzione, and O'Brien named the Sensory Computation/Experimental Narrative Environments (SCENE) Lab. The goal of SCENE Lab is to create hybrid software and hardware systems for the development and presentation of immersive virtual spaces. SCENE Lab houses the Virtual Experience, Sonic Projection, Extended Reality System (VESPERS), comprising a 24.2 high-density loudspeaker array. The system can function as three individual eight-channel loudspeaker systems, including a first-order ambisonics (FOA) cube and a higher-order ambisonics (HOA) ring. This first-order 3D loudspeaker system permits musical experimentation using performative models that exploit height or elevation cues within virtual reality environments.

Figure 2. VESPERS: SCENE Lab's 24.2 high-density loudspeaker array including an FOA 3D 8-channel cube and HOA 2D 16-channel ring.

Previous iterations of our live performance system enabled the composer/improviser to superimpose or map abstract musical structures or image schemas onto a physical performance space. An obvious progression to accommodate more detailed musical models is to increase the resolution and dimensions of our loudspeaker array accordingly. VESPERS also integrates the HTC Vive (Valve 2016) virtual reality headset system, which provides highly sophisticated motion capture through two base stations emitting pulsed IR lasers. This motion capture system provides high-precision tracking for user head movements and other bodily gestures through compatible hand controllers. The multidimensional, non-linear tools provided by the VESPERS environment provide an ideal base for the development of more layered musical models for real-time instrumental performance systems.

Exploring a Tonal Model in 3D Space

Previous iterations of our performance system utilized melodic syntax computations to inform the steering behaviours of a flocking algorithm for spatialization.

Figure 3. Dynamic Tonal-Spatial Mappings from Graham and Bridges (2015).

Figure 4. Introducing the cross-platform [lerdahl] external for Pure Data written by Graham (2016).

Graham has since developed an external for the Pure Data (Pd) visual programming language. Version 1 receives any MIDI input range and outputs values for pitch class, basic space, closure, tension, ratios of asymmetrical attraction, and pitch class distance. The user may change the configuration of the basic space to accommodate any of the seven modes of the major scale. The user may also offset pitch class zero if they want zero to be something other than MIDI note C3 or C4. Version 2 will have list outputs and more useful construction arguments. These discrete values have proven to be a useful representation of real-time figurative gestures. Given the ability to support elevation cues in VESPERS, this paper presents mapping strategies extending our tonal-spatial model to exploit pitch height schemas within SCENE Lab's 3D FOA cube.
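To make the basic space output concrete, the following is a minimal sketch of the kind of lookup Lerdahl's (2001) diatonic basic space implies, written in C# for consistency with the Unity examples later in this paper. It illustrates the underlying model only; it is not the [lerdahl] external's actual implementation, and the class and method names are ours.

```csharp
using System.Linq;

// Illustrative sketch of a Lerdahl (2001) basic space lookup.
// The five levels of the diatonic basic space for a major-key context:
// octave {0}, fifth {0,7}, triad {0,4,7}, diatonic scale, chromatic.
// A pitch class's basic space value here is the number of levels that
// contain it, so stable tones (e.g. the tonic) score highest.
public static class BasicSpace
{
    static readonly int[][] Levels =
    {
        new[] { 0 },                          // octave (root) level
        new[] { 0, 7 },                       // fifth level
        new[] { 0, 4, 7 },                    // triadic level
        new[] { 0, 2, 4, 5, 7, 9, 11 },       // diatonic level
        Enumerable.Range(0, 12).ToArray()     // chromatic level
    };

    // midiNote: any MIDI note number; tonic: pitch class of the tonic (0 = C).
    public static int Depth(int midiNote, int tonic = 0)
    {
        int pc = ((midiNote - tonic) % 12 + 12) % 12; // pitch class relative to tonic
        return Levels.Count(level => level.Contains(pc));
    }
}
// Example: Depth(60) == 5 for the tonic C; Depth(62) == 2 for the supertonic D.
```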

Unity and Pure Data: Physics-Informed Timbral and Spatial Processing

Prior to our current system's design, we focused on metaphors found in flocking, gravity, and tonality. These mapping strategies may be further developed in an interactive environment using the physics and collision detection components within the game development environment, Unity.

Figure 5. Third-person controller interacting with objects with Unity's physics engaged, with variable mass, material, and drag. Gravity may also be manipulated in real time.
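Our earlier flocking behaviours (after Reynolds 1983) translate directly into this physics context. Below is a minimal sketch of boids-style steering on Unity Rigidbody components; the weights and radius are illustrative placeholders, whereas in our system these behaviours are driven by melodic syntax data rather than fixed constants.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal boids-style steering (Reynolds 1983) for sound objects in Unity.
public class Boid : MonoBehaviour
{
    public static readonly List<Boid> Flock = new List<Boid>();
    public float neighbourRadius = 4f;
    public float separationWeight = 1.5f, alignmentWeight = 1f, cohesionWeight = 1f;

    Rigidbody body;

    void OnEnable()  { Flock.Add(this); body = GetComponent<Rigidbody>(); }
    void OnDisable() { Flock.Remove(this); }

    void FixedUpdate()
    {
        Vector3 separation = Vector3.zero, alignment = Vector3.zero, cohesion = Vector3.zero;
        int neighbours = 0;

        foreach (var other in Flock)
        {
            if (other == this) continue;
            Vector3 offset = other.transform.position - transform.position;
            if (offset.magnitude > neighbourRadius) continue;

            separation -= offset / Mathf.Max(offset.sqrMagnitude, 0.01f); // push away, stronger when close
            alignment  += other.body.velocity;                            // match neighbours' heading
            cohesion   += other.transform.position;                       // move toward local centre
            neighbours++;
        }

        if (neighbours == 0) return;
        alignment /= neighbours;
        cohesion = cohesion / neighbours - transform.position;

        body.AddForce(separation * separationWeight
                    + alignment * alignmentWeight
                    + cohesion * cohesionWeight);
    }
}
```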

Unity provides an ideal canvas for multi-dimensional and user-directed interaction. Spatialization may now be driven by physical gestures captured using the highly sophisticated Lighthouse motion tracking system (Valve 2016). Early tests using the Vive hand controllers allowed any individual to morph the spatiotemporal structure with minimal latency. A performer may morph timbral and spatial parameters through regular or familiar instrumental hand gestures (Graham and Cluett 2016). However, this paper will focus on the extraction, scaling, and mapping of discrete digital audio signals from our instrumental sound source, as opposed to the physical performance gestures explored in the aforementioned paper by Graham and Cluett. The goal of future work will be to combine both physical and figurative systems within a virtual scene to create an embodied and interactive musical experience.

Navigating a Virtual Performance Space

Figure 6. The Reality-Virtuality Continuum, based on Milgram et al. (1994).

Milgram's "Reality-Virtuality Continuum" (1994, p. 283) presents a compelling series of questions regarding the integration of a musical-instrument-based performance system within a virtual reality environment: where does this combination fall on the continuum, and what are the implications for the composer/improviser? Are we dealing with the augmentation of the listener's real environment through a virtual environment? The performer's experience, meanwhile, is potentially based solely within a virtual environment. As we consider this question, one may look to Smalley's ideas surrounding spectromorphology (1997) and space forms (2007) to guide the development of new mappings between real and virtual worlds. A number of his definitions are directly applicable within our developing virtual environment and may inform our practical examples.

Tonal Pitch Space: subdivision of spectral space into intervallic steps.
Spectral Space: the impression of space and spaciousness produced by the range of audible frequencies.
Microphone Space: intimacy of the image is magnified.

Table 1. Useful definitions from Smalley (2007, p. 55) for establishing new mappings within our virtual performance space.

In previous system iterations, abstract tonal pitch space data was used to drive spatialization parameters. In this iteration, the performer is able to physically interact with sound objects within a virtual space using instrumental gestures. One can relate this to Smalley's notion of agential space, "a space articulated by human (inter)action with objects, surfaces, substances, and built structures, etc. Combines with utterance space to create enacted space" (Smalley 2007, p. 55). The following practical examples illustrate how one can extract and exploit the "energy-motion trajectory" and "spectromorphological expectation" (Smalley 1997) of each note-event performed by the improviser/composer using time-frequency analysis tools in Pd.

Figure 7. William Brent's timbreID library for Pure Data (Pd) provides useful computations for low-level timbral features.

The mappings use a new implementation of Brent's bark-scale-based timbral analysis Pd externals (Brent 2016). These externals are especially useful for performance systems design aimed at extracting and reifying the various energy profiles associated with timbral structures found in each individual audio input signal. Bark-spectrum versions of lower-level features, such as spectral flatness, centroid, and flux, are more useful to a designer focusing on perceivable change in musical structures, particularly changes representative of a performer's energy-motion output modalities.
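As an illustration of the kind of feature involved, the following sketches a bark-weighted spectral flux measure in C#. It is in the spirit of timbreID's bark-based flux analysis rather than a port of it, and it assumes per-frame bark-band magnitudes are computed upstream.

```csharp
// Sketch of a bark-weighted spectral flux measure. Assumes bark-band
// magnitudes have already been computed for each analysis frame.
public class BarkFlux
{
    float[] previous;

    // barkBands: magnitudes for each bark band in the current frame.
    // Returns the half-wave rectified flux: only energy increases count,
    // which tracks onset/excitation energy rather than decay.
    public float Next(float[] barkBands)
    {
        float flux = 0f;
        if (previous != null)
        {
            for (int i = 0; i < barkBands.Length; i++)
            {
                float rise = barkBands[i] - previous[i];
                if (rise > 0f) flux += rise;
            }
        }
        previous = (float[])barkBands.Clone();
        return flux;
    }
}
```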


Supporting Video Examples

The following section presents a series of practical mapping strategies exploiting the tonal, spectral, and microphone spaces outlined above.

Driving Tonal-Spatial Mappings Using Physics in Unity

Figure 8. Individual audio channels per register are mapped to 3D objects in Unity.

In this example¹, we explore the notion of pitch height, first by simply mapping our basic space cone to basic z-axis locations, from the ground plane to the top plane of the FOA cube. Individual audio channels are mapped to sound objects within the virtual space. Their position on the vertical plane is determined by the basic space value of the note performed by the instrumentalist. In the second example², we apply the computations from the melodic syntax external for Pd to control flocking behaviors within Unity in 3D space. The incorporation of elevation cues using our FOA cube is incredibly effective in reinforcing the superimposed tonal-spatial model within the virtual reality environment, particularly given the ability to assign mass, drag, and gravity-based interactions using physics within Unity.

Using Spectral Flux to Determine Rigidbody Attributes

In this example³, spectral flux values are scaled and mapped to determine the mass and drag of each individual sound object within the virtual performance environment. Data is captured and sent over Open Sound Control (OSC) to control Unity parameters using a C# script (a sketch of such a script appears after Figure 9).

¹ See Example 1 in KEAMSAC Playlist: http://bit.ly/2dtUa74
² See Example 2 in KEAMSAC Playlist: http://bit.ly/2dtUa74
³ See Example 3 in KEAMSAC Playlist: http://bit.ly/2dtUa74

Figure 9. Instrumental features extracted in Pd may be scaled and mapped to control Rigidbody attributes in Unity using Open Sound Control (OSC).
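The following is a minimal sketch of the receiving side of these mappings, covering both the pitch-height mapping of Example 1 and the mass/drag mapping of Example 3. The OSC transport is assumed to be handled elsewhere (by any Unity OSC plugin); the method names, scaling ranges, and mapping directions are illustrative choices, not our production script.

```csharp
using UnityEngine;

// Hypothetical per-register receiver for values arriving from Pd over OSC.
public class RegisterSoundObject : MonoBehaviour
{
    public float floorY = 0f, ceilingY = 4f;   // ground plane to top plane of the FOA cube
    public float minMass = 0.1f, maxMass = 10f;
    public float maxDrag = 5f;

    Rigidbody body;
    void Awake() { body = GetComponent<Rigidbody>(); }

    // Example 1: basic space value (here assumed 0..5) -> elevation.
    // Placing stable tones high and unstable tones low is one design
    // choice; the polarity could equally be reversed.
    public void ReceiveBasicSpace(float basicSpace)
    {
        float t = Mathf.InverseLerp(0f, 5f, basicSpace);
        Vector3 p = transform.position;
        transform.position = new Vector3(p.x, Mathf.Lerp(floorY, ceilingY, t), p.z);
    }

    // Example 3: spectral flux (assumed normalized 0..1 in Pd) -> mass and
    // drag, so spectrally active registers become heavier, more damped bodies.
    public void ReceiveFlux(float flux01)
    {
        body.mass = Mathf.Lerp(minMass, maxMass, flux01);
        body.drag = Mathf.Lerp(0f, maxDrag, flux01);
    }
}
```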

Using Attack Detection, Amplitude Envelope, and Spectral Flux to Determine Energy-Motion Profiles

In our previous design (2015), we used note attack detection in combination with a basic envelope follower to determine gravitational attraction or centricity relative to a central point within the performance space. This was a limited, albeit somewhat effective, mapping strategy in that it produced a perceptually clear and distinct musical effect relative to a performer's onset energy when performing any note on any register of the multichannel guitar system. The continuant portion of the event was then tracked to allow the system to determine the ideal termination phase, at which point a new series of signal processes may be applied to the real-time instrumental audio input source (a sketch of this attack/continuant tracking follows Table 2).

Attack or Onset    Sustain or Continuant    Release or Termination
Emergence          Prolongation             Disappearance
Departure          Passage                  Arrival
Launching          Plane                    Goal

Table 2. Smalley's basic energy-motion profiles and theorized embodied associations by Graham and Bridges (2015).
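A minimal sketch of such an attack detector follows, assuming the common fast/slow envelope-follower approach: an onset is reported when a fast envelope overtakes a slow one, and the continuant phase is read from the slow envelope. The coefficients and threshold are illustrative, not our tuned values.

```csharp
using System;

// Fast/slow envelope followers over the rectified signal; a new attack is
// the rising edge of (fast - slow) crossing a threshold, and the slow
// envelope approximates the continuant (sustain) energy of the event.
public class EnvelopeOnsetDetector
{
    readonly float fastCoeff, slowCoeff, threshold;
    float fastEnv, slowEnv;
    public bool InAttack { get; private set; }

    public EnvelopeOnsetDetector(float sampleRate,
                                 float fastMs = 5f, float slowMs = 80f,
                                 float threshold = 0.05f)
    {
        // One-pole smoothing coefficients from time constants (assumed values).
        fastCoeff = (float)Math.Exp(-1.0 / (sampleRate * fastMs * 0.001));
        slowCoeff = (float)Math.Exp(-1.0 / (sampleRate * slowMs * 0.001));
        this.threshold = threshold;
    }

    // Returns true on the sample where a new attack is detected.
    public bool Process(float sample)
    {
        float x = Math.Abs(sample);
        fastEnv = fastCoeff * fastEnv + (1f - fastCoeff) * x;
        slowEnv = slowCoeff * slowEnv + (1f - slowCoeff) * x;

        bool attacking = fastEnv - slowEnv > threshold;
        bool newOnset = attacking && !InAttack;  // rising edge only
        InAttack = attacking;
        return newOnset;
    }

    public float Continuant => slowEnv;  // sustain-phase energy estimate
}
```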


In this iteration⁴, the size and trajectory of sound objects will continue to be determined by the amplitude envelope of their respective instrumental register. Our propulsion idea may be further developed within a physics-sensitive space relative to peak amplitude at note onset, the amplitude envelope at continuant and termination phases, and the spectral flux weighted using the bark-frequency scale.

Figure 10. Attack-Projection-Linearity from Graham and Bridges (2015).

In our 2015 iteration, our attack-projection-linearity mapping juxtaposed strong note attacks against weak note attacks, with strong note attacks resulting in a strongly linear motion across the quadrants of the performance space and weak attacks assuming a random spatial trajectory (see the sketch below). In this previous iteration, strong projections also morphed timbral events between monophonic and multichannel distortion or gain structures. The monophonic distortion structures presented a much more integrated scene, whereas the multichannel distortion presented a much more segregated scene. This notion was largely based on the previously discussed order-to-disorder (or loose-tight) continuum. In this iteration, we extend these previous mappings by exploring transitions between Pitch Synchronous Overlap-Add (PSOLA) based granular synthesis and jitter-based/randomized granular clouds using Graham's [gramulator~] abstraction for Pd.

⁴ See Example 4 in KEAMSAC Playlist: http://bit.ly/2dtUa74
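The attack-projection-linearity mapping can be sketched in the current physics context as follows, assuming the onset detector above supplies a normalized attack strength. The threshold, forces, and method names are illustrative rather than the 2015 implementation.

```csharp
using UnityEngine;

// Strong onsets launch a sound object along a clear linear path; weak
// onsets hand its motion over to a randomized drift (the disorder end of
// the order-to-disorder continuum).
public class AttackProjection : MonoBehaviour
{
    public float strongThreshold = 0.6f;  // illustrative boundary, not a fixed value
    public float launchForce = 8f, driftForce = 1.5f;

    Rigidbody body;
    void Awake() { body = GetComponent<Rigidbody>(); }

    // attackStrength: peak onset amplitude, assumed normalized 0..1 in Pd.
    public void OnNoteAttack(float attackStrength, Vector3 projectionDirection)
    {
        if (attackStrength >= strongThreshold)
        {
            // Strong attack: clear linear projection across the quadrants.
            body.velocity = Vector3.zero;
            body.AddForce(projectionDirection.normalized * launchForce * attackStrength,
                          ForceMode.Impulse);
        }
        else
        {
            // Weak attack: random spatial trajectory.
            body.AddForce(Random.onUnitSphere * driftForce, ForceMode.Impulse);
        }
    }
}
```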

Figure 11. Graham's [granulator~] system for Pd (2016).

This granulation abstraction permits the composer to morph seamlessly between linear playback of a stream of time-stretched grains and randomly concatenated streams of grains in real time (the scheduling logic is sketched at the end of this section). Streams of grains may be confined to a static spatial location within the performance space, or they may be panned dynamically throughout the space, relative to the aforementioned tonal-spatial model. Streams of grains may also assume their own unique gain structure per stream (macro) or per grain (micro).

Using Spectral Flux to Determine Timbral Characteristics of Each Register

In this example⁵, spectral space may be further explored using continuous values representative of energy per note-event mapped to all-pass filter frequency controls. This audio mixer plugin for Unity was created using Pd and Enzien Audio's online platform, Heavy (2016).

Exploring Microphone Space Intimacy in Relation to Musical Dynamics

The relationship between the musical dynamics of a real-time instrumentalist and the three-dimensional scene has an immediate impact on the perception of the immersive space and the intimacy of the overall auditory scene. In this example⁶, we utilize the ambisonic B-Format zoom function authored by David Malham and ported to Pd and Max/MSP externals by Matthew Paradis (2002). This function allows a user to zoom in on a defined point of a first-order soundfield positioned within the virtual space.
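The grain-stream morphing described above reduces to a simple scheduling decision, sketched here for illustration only; this is not a port of the Pd abstraction, and buffer handling and grain windowing are assumed to live elsewhere.

```csharp
using System;

// morph = 0 reads grains in order for time-stretched linear playback;
// morph = 1 picks grain start points at random (jittered granular cloud).
public class GrainScheduler
{
    readonly Random rng = new Random();
    readonly int bufferLength;   // source buffer length in samples
    readonly int grainLength;    // grain size in samples
    readonly int hop;            // hop < grainLength gives time-stretching overlap
    int linearPosition;          // read head for ordered playback

    public GrainScheduler(int bufferLength, int grainLength, int hop)
    {
        this.bufferLength = bufferLength;
        this.grainLength = grainLength;
        this.hop = hop;
    }

    // Returns the start sample of the next grain for a given morph amount (0..1).
    public int NextGrainStart(float morph)
    {
        int linear = linearPosition;
        linearPosition = (linearPosition + hop) % (bufferLength - grainLength);

        int random = rng.Next(0, bufferLength - grainLength);

        // Crossfade the probability of choosing an ordered vs random grain,
        // so intermediate morph values yield partially jittered streams.
        return rng.NextDouble() < morph ? random : linear;
    }
}
```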

⁵ See Example 5 in KEAMSAC Playlist: http://bit.ly/2dtUa74
⁶ See Example 6 in KEAMSAC Playlist: http://bit.ly/2dtUa74


Figure 12. Soundfield manipulations from Malham (2008).

Malham's B-Format zoom implementation allows a user to choose a specific point within a three-dimensional soundfield and zoom in on or away from that point. Aside from making sounds in the defined direction appear louder, this technique will also reduce the angular spread of sounds in the opposite direction. This is an effective tool for sculpting musical dynamics in general, and more specifically for exploiting the aforementioned microphone space, where the intimacy of the image is increased through magnification.

Conclusion: Moving Towards a Pitch-Timbre-Space Model for Virtual Reality Based Music Performance Environments

We have presented a developing interactive model for extended instrumental practice. Our system extracts note events, pitch, amplitude, and spectral data per register and utilizes this data to inform timbral and spatial processes within our developing physics-driven scene in Unity. This system will permit the user to engage with primitive and higher-level, more abstract musical ideas in an interactive environment governed by their musical choices and bodily movements. Future work will explore the development of a more detailed timbral space whereby sections of the abstracted virtual space may be divided into detailed axes, sectors, or quadrants representing more complex spatiotemporal structures. Sound objects may also directly interact with one another based on the performer's physical position within the space. This approach would ensure more fluid performance interaction, whereby the performer is freed from the confines of the physical world and is given free rein over spatial location behaviors based on commonly understood gravitational interactions and familiar everyday physical gestures.

Acknowledgments: We would like to extend our thanks to the faculty at SCENE Lab and Dr. Kelland Thomas at Stevens Institute of Technology for their continued support of this project.

References

Graham, R. (2012). Expansion of Electronic Guitar Performance Practice through the Development of Interactive Digital Music Systems. Ph.D. Thesis. Londonderry: Ulster University.

Graham, R. and Bridges, B. (2014a). Strategies for Spatial Music Performance: Practicalities and Aesthetics for Responsive Mappings. In Divergence Press, Issue 3. Online.

Graham, R. and Bridges, B. (2014b). Gesture and Embodied Metaphor in Spatial Music Performance Systems Design. In Proceedings of the International Conference on New Interfaces for Musical Expression, pp. 581-584. London: Goldsmiths.

Graham, R. and Bridges, B. (2015). Managing Musical Complexity with Embodied Metaphors. In Proceedings of the International Conference on New Interfaces for Musical Expression, pp. 103-106. Baton Rouge: Louisiana State University.

Graham, R. and Cluett, S. (2016). The Soundfield as Sound Object: Virtual Reality Environments as a Three-Dimensional Canvas for Music Composition. In Proceedings of the Audio for Augmented and Virtual Reality Conference. Los Angeles: Audio Engineering Society Annual Conference.

Graham, R. and Harding, J. (2015). SEPTAR: Audio Breakout Circuit for Multichannel Guitar. In Proceedings of the International Conference on New Interfaces for Musical Expression, pp. 241-244. Baton Rouge: Louisiana State University.

Johnson, M. (2007). The Meaning of the Body: Aesthetics of Human Understanding. Chicago: University of Chicago Press.

Lerdahl, F. (2001). Tonal Pitch Space. Oxford: Oxford University Press.

Malham, D. (2008). Spatial Hearing Mechanisms and Sound Reproduction. Retrieved from http://www.york.ac.uk/inst/mustech/3d_audio/ambis2.htm on September 30, 2016.

Malham, D. (2002). B-Zoom. Retrieved from http://www.york.ac.uk/inst/mustech/3d_audio/vst/bzoom_help.html on September 30, 2016.

Milgram, P., Takemura, H., Utsumi, A. and Kishino, F. (1994). Augmented Reality: A Class of Displays on the Reality-Virtuality Continuum. Telemanipulator and Telepresence Technologies, vol. 2351(34), pp. 282-292. Washington: SPIE.

Reynolds, C. (1983). Boids Flocking Algorithm. Retrieved from www.red3d.com/cwr/boids/ on September 30, 2016.


Roth, M. (2016). Enzien Audio / Heavy. Retrieved from http://enzienaudio.com on September 30, 2016.

Smalley, D. (1997). Spectromorphology: Explaining Sound-Shapes. Organised Sound, 2(2), pp. 107-126. Cambridge: Cambridge University Press.

Smalley, D. (2007). Space-Form and the Acousmatic Image. Organised Sound, 12(1), pp. 35-58. Cambridge: Cambridge University Press.