Taptop: Using the Laptop Chassis as a Musical Controller

Oliver Thompson¹ and Christopher Harte²

¹ Department of Electronics, University of York, UK. [email protected]
² Melodient Limited, Leeds, UK. [email protected]

Abstract: This paper presents Taptop, a new gesture control application designed for controlling drum machines, samplers and other software virtual instruments running on laptop computers. Without requiring external electronic hardware, Taptop turns the chassis of the laptop into a velocity-sensitive control surface for triggering samples by combining data collected from the built-in webcam and microphone to produce MIDI messages. The user taps on the unused space on either side of the laptop mouse trackpad, which is divided into a number of virtual 'drum pads'. Each virtual pad causes a different MIDI note to be output when tapped, with the velocity of that note determined by the force of the tap. This control solution aims to provide a more affordable and portable alternative to the hardware MIDI controller, and to increase the potential of the computer as a standalone performance instrument.

Keywords: Gesture Control, Drum Pads, Velocity Sensitive, Live Performance, Musical Interface

Introduction

Software drum machines and samplers are an important part of the workflow of many electronic music producers. Yet, despite the prominent role of these instruments in the modern music studio, the options for controlling them remain restricted to a limited number of methods. This issue is not limited to software drum machines and samplers; it applies to other virtual instruments too. While significant advances continue to be made in developing more powerful and flexible virtual instruments, the ways in which musicians interact with those instruments have remained largely the same since the inception of the Digital Audio Workstation (DAW) (Theberge 1997).
When composing with a virtual instrument, a musician will often arrange sequences of notes via their DAW piano roll. This method of control obscures the musician's interaction with their instrument behind software, in contrast to an acoustic instrument where the physical gestures that produce the sound are immediately apparent to the audience (Dobrian 2001). Well-thought-out mapping of physical gestures to musical parameters could therefore provide a more expressive method of control for software instruments, and help to reintroduce some of the performance element that is often lost when making music with a computer (Bown, Bell, and Parkinson 2014).

For the purposes of musical interaction, it is important that a gesture control system provides the musician with the opportunity to develop a repertoire and a degree of musicality over time through practice, as would be the case with a more traditional instrument. When asked what instrument they play, few computer musicians would respond with 'I play the computer'. This is largely due to the fact that the majority of human-computer interaction interfaces leave little room for progression or virtuosity (Wessel and Wright 2002). The familiar method of controlling a computer through a keyboard and mouse or trackpad may be intuitive, but it is also incredibly restrictive as a musical interface (Pavlovic, Huang, and Sharma 1997). Such devices are unable to capture velocity information and hence limit musical expression. However, the keyboard and mouse are not the only sensors available on a modern laptop; indeed, Fiebrink et al. have investigated the use of various built-in sensors, including microphones, accelerometers, and webcams among others, for controlling music (Fiebrink, Wang, and Cook 2007). Hong and Yeo used acoustic feedback between the computer speaker and its own microphone to drive audio DSP for experimental audio-visual performance (Hong and Yeo 2013).



Figure 1. Taptop in use. An angled mirror allows the system to detect the user's hand motion in the drum pad area using the built-in webcam. The user has a choice of dividing the laptop chassis into 2, 4 or 8 drum pads, selected via a series of buttons on the GUI. The drum pads increase in width as fewer pads are selected, so that the entirety of the laptop chassis width is always utilised. The angle of the mirror attachment is adjustable so that it can be used with a variety of laptop models.

External MIDI controllers go some way in enhancing the musical performance capabilities of the computer, but they are often expensive devices and constitute another piece of equipment for a musician to transport or accommodate in their studio (Gillian and Nicholls 2012). A system that provides the same functionality as a MIDI controller without the requirement for external electronic hardware would therefore possess significant advantages. To this end, we have developed Taptop, a gesture control software application that uses the laptop's own microphone and camera to detect the user's actions. Using these built-in sensors, we allow unused space on the laptop chassis itself to be used as a well-established interface for controlling virtual drum machines and samplers, namely a bank of drum pads. Drum pads are an intuitive interface that anyone can use to trigger samples, but they also allow a performer to develop their skills sufficiently over time to be considered an expert (Zamborlin et al. 2014); the electronic musician Jeremy Ellis is a good example (Ellis 2011).

The concept of using microphones to turn a generic surface into an interface for controlling digital audio is not new. A number of systems have already been successfully implemented using this idea, the most widely known probably being 'Mogees' (Zamborlin 2016). Mogees are small devices containing a contact microphone which, when connected to a smartphone or computer, allow the user to drive a synthesis engine in accompanying software simply by hitting or rubbing a surface. A single microphone allows the system to capture timbre and transient information, but some other approaches have used multiple transducers so that position information can be captured as well. For example, Bisby et al. (2014) developed an instrument using an array of piezos to achieve accurate localisation of sound sources on a surface, while Novello and Raijekoff (2015) utilised a pair of contact microphones to locate the gestures of the user on a surface, mapping the position of movement to the pitch of the instrument.

System Overview

Taptop was implemented in the Pure Data graphical programming language, making extensive use of the image and video processing objects that comprise the GEM library. Instead of physical drum pads, the user taps in the unused space on either side of their laptop mouse trackpad, which is divided into a number of horizontally arranged virtual pads, represented by coloured boxes in figure 1. The laptop webcam's field of view is positioned over this area by means of a mirror attachment, the only external accessory required to use Taptop. To assist the user in positioning the mirror attachment, the webcam stream is displayed in a window overlaid with a black rectangle that indicates where the mouse trackpad should be located.
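The paper does not give the exact partitioning arithmetic, but the division of the camera's view into a row of equal-width pads (2, 4 or 8 in Taptop's GUI) can be sketched as follows. The function name and the equal-split rule are our assumptions, not taken from the paper:

```python
def pad_regions(frame_width: int, num_pads: int) -> list[tuple[int, int]]:
    """Split the webcam frame width into num_pads equal horizontal regions.

    Returns (x_start, x_end) pixel bounds for each virtual pad. Integer
    division keeps the regions contiguous and covering the full width.
    """
    bounds = []
    for i in range(num_pads):
        x_start = i * frame_width // num_pads
        x_end = (i + 1) * frame_width // num_pads
        bounds.append((x_start, x_end))
    return bounds
```

With fewer pads selected, each region is simply wider, matching the behaviour described in the Figure 1 caption.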

Taptop controls virtual instruments via MIDI. In order to trigger a MIDI note, two criteria must be met; this process is described in the flowchart in figure 2. Firstly, the audible tap of the user's finger on the chassis of the laptop must be detected through analysis of the laptop microphone signal. The level of this tap corresponds to the velocity of any MIDI note subsequently triggered. Secondly, movement must be found to be occurring in at least one of the virtual drum pads by a motion detection algorithm applied to the webcam signal. Each of these drum pads corresponds to a different MIDI note number. When a tap is detected in the microphone signal, MIDI note-on messages are sent to the virtual instrument being controlled, with note numbers determined by the movement occurring in the drum pads.
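The two-criteria gating described above can be sketched as follows. The pad-to-note mapping (General MIDI drum notes) and the linear velocity scaling are illustrative assumptions; the paper does not specify either:

```python
# Hypothetical pad -> MIDI note mapping, chosen for illustration only
# (36 = kick, 38 = snare, 42 = closed hat, 46 = open hat in General MIDI).
PAD_NOTES = {0: 36, 1: 38, 2: 42, 3: 46}

def notes_for_tap(tap_level: float, moving_pads: list[int]) -> list[tuple[int, int]]:
    """Gate logic: an audible tap only triggers notes if motion was also
    detected in at least one pad. Returns (note_number, velocity) pairs.

    tap_level is assumed normalised to the range 0.0-1.0; it is scaled
    linearly onto the MIDI velocity range 1-127.
    """
    if not moving_pads:
        return []  # tap heard but no visual motion: ignore it
    velocity = max(1, min(127, int(round(tap_level * 127))))
    return [(PAD_NOTES[p], velocity) for p in moving_pads if p in PAD_NOTES]
```

A tap with motion in pad 1, for example, yields a single note-on for the snare note at a velocity proportional to the tap's loudness, while a tap with no accompanying motion produces nothing.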

Figure 2. Data flow in the system. Taptop translates data from the laptop webcam and microphone to produce MIDI messages which are then used to control a virtual instrument.

Detecting Movement

Taptop crops the laptop webcam signal into a number of regions, each representing a virtual drum pad. It is the movement of the user's fingers in these regions that determines which note numbers are encoded into the MIDI note-on messages that are sent. Movement is detected by comparing the current and previous frames of each region to determine which pixels have significantly changed. The video stream is converted from an RGB to an RGBA colour space so that pixels that demonstrate a change between subsequent frames can be stored in the empty alpha channel. If the number of pixels stored in the alpha channel is above a set threshold, the system assumes that movement is occurring in that drum pad.
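A minimal sketch of this frame-differencing test, assuming greyscale frames and illustrative threshold values (the paper's GEM implementation instead stores changed pixels in the alpha channel of an RGBA stream):

```python
import numpy as np

# Illustrative thresholds, not taken from the paper: a pixel counts as
# "changed" if its intensity differs by more than PIXEL_DELTA between
# frames, and a pad counts as "moving" if more than MOTION_THRESHOLD
# pixels changed.
PIXEL_DELTA = 30
MOTION_THRESHOLD = 200

def pad_has_motion(prev_frame: np.ndarray, curr_frame: np.ndarray) -> bool:
    """Frame-differencing motion test for one cropped pad region.

    Both arguments are greyscale uint8 arrays of the same shape. Cast to
    a signed type before subtracting to avoid uint8 wrap-around.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed_pixels = int(np.count_nonzero(diff > PIXEL_DELTA))
    return changed_pixels > MOTION_THRESHOLD
```

Running this test once per pad region per frame gives the list of "moving" pads that the note-triggering stage consumes.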

Detecting Transients

To detect the audible tap of the user's finger on the laptop chassis, we used a method based on the 'Bonk' Pure Data object (Puckette 1997). Bonk is a transient detector that has the ability to determine which instrument was responsible for a particular attack by comparing spectral changes to a set of stored templates. Taps on the laptop chassis were recorded at a variety of velocities, then stored as templates so that Bonk could be taught to ignore other audio being picked up by the microphone. This helped address the possible issue of feedback in the system, where playing back drum samples through loudspeakers could cause the application to trigger itself. While some drum hits may have a similar envelope to the templates, the spectral differences between the two should cause them to be ignored by Bonk in most cases.
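Bonk's actual analysis is more elaborate, but the template-matching idea can be illustrated with a crude sketch: normalise the magnitude spectrum of a detected onset and accept it as a chassis tap only if it lies close to one of the pre-recorded tap templates. The normalisation, Euclidean distance measure, and threshold here are our assumptions, not Bonk's:

```python
import numpy as np

def matches_tap_template(onset_spectrum: np.ndarray,
                         templates: list[np.ndarray],
                         max_distance: float = 0.5) -> bool:
    """Return True if the onset's spectral shape resembles a stored tap.

    Spectra are normalised to unit length so that only spectral shape,
    not loudness, decides the match (loudness maps to velocity instead).
    """
    norm = np.linalg.norm(onset_spectrum)
    if norm == 0:
        return False
    spec = onset_spectrum / norm
    for template in templates:
        template = template / np.linalg.norm(template)
        if np.linalg.norm(spec - template) < max_distance:
            return True
    return False  # spectrally unlike any tap, e.g. a played-back drum hit
```

This is how playback feedback is rejected in principle: a loudspeaker drum hit may share the tap's amplitude envelope, but its spectral shape falls outside the template neighbourhood.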

[Figure 2 flowchart: the webcam signal, reflected via the mirror, is partitioned and cropped into pads 1 to n; motion detection in each pad feeds a pad-to-note mapping, while transient detection on the microphone signal produces events with velocity as the user taps the chassis. Together these generate MIDI note-on messages sent to the virtual instrument.]

User Testing

A series of user tests was carried out in order to determine the success of Taptop as a musical interface. Ten participants were selected, each with a background in electronic music production. This was to ensure that those taking part in the tests would have the relevant knowledge to provide informed opinions on the performance and features of the application. Two lines of investigation were employed: a focus group to establish whether the concept was popular among music producers, followed by more detailed beta testing to ascertain how producers utilised Taptop over an extended period of time. The focus group and beta test involved five participants each.

Methodology

In the focus group, participants were first given an introduction to the concept behind Taptop and the current issues in virtual instrument control that it was designed to address. They were then instructed on how to use the application before being given ten minutes alone to try it out for themselves, controlling a virtual instrument. The tests were carried out under controlled conditions in a room where the level of lighting could be adjusted. This meant that the performance of Taptop could be evaluated in bright, dark, and optimal lighting, to reflect the different environments that it may be used in. After the tests were completed, the participants each answered a questionnaire about their experiences with the application.

For beta testing, the participants were each provided with a copy of Taptop to try in their own studio setups. Over the period of a week, they could use the application in whatever way they chose. After this time had elapsed, the participants submitted feedback on how it had performed. The two key things to establish from this were the various ways in which the participants utilised Taptop, and whether they would consider adopting the application over their usual method of control at the end of the trial period.

Results

In the focus group questionnaire results, the two features of Taptop that participants most frequently said they liked were its self-contained, 'in the box' format, and its ability to control the playback velocity of samples through a physical action. The concept of a musical interface being completely integrated into the same device as the sound source proved attractive to those participants who make use of hardware MIDI controllers. Although manufacturers put effort into making such devices more portable, not having to accommodate one at all would be desirable to a musician who has to travel or who has limited studio space. For those participants who use a DAW piano roll to control their virtual instruments, the input of note velocity through gesture was seen as a more expressive alternative to programming velocity automation within a DAW.

The aspect of Taptop that participants most commonly disliked was its reliability. If lighting was not at an optimum level then the application had difficulty in detecting the user's movement, meaning that samples were sometimes not triggered when they were supposed to be. Since the environments that electronic music is performed in are generally dimly lit with sporadic bursts of very bright light, this is a significant problem that needs to be addressed. Another concern raised by the participants was that tapping a laptop too hard in an area where the hard drive is located may result in the drive being damaged. This is especially relevant for magnetic disk drives, which are more susceptible to impact damage than solid-state storage (Pinheiro, Weber, and Barroso 2007). It has yet to be established whether or not long-term use of the application has an effect on the hard drive of the laptop that it is run on.

The results of the beta testing are displayed in Table 1. Feedback was generally positive, with 3 out of the 5 participants stating that they would consider using Taptop in place of their usual method of control. Participant 2 encountered issues with Pure Data being unable to stream video from the laptop webcam, thought to be caused by a compatibility problem between the 'pix_video' Pure Data object and certain versions of the Mac OS X operating system. For this reason, they were unable to get the application working in their studio setup and therefore stated that they preferred their usual method of control to Taptop.

Table 1. Results of beta testing. Feedback was generally positive, with three candidates willing to consider using Taptop to replace their current control method. *Participant 2 was unable to run the system due to a compatibility problem with Pure Data on their laptop.

Participant 5 also stated that they preferred their usual method of control, a MIDI keyboard, to Taptop. While they thought that the application was useful for generating ideas on the move, they enjoyed the tactile feedback of keys being depressed when performing on a hardware device. It is interesting to note that Participant 4 used Taptop in conjunction with software synthesisers rather than with software drum machines and samplers, as was its intended use. They commented that the application provided a more immediate way of interfacing with synthesisers when auditioning patches than their usual method of control, a DAW piano roll. Participants 1 and 3 used Taptop for its original purpose, controlling drum machine and sampler virtual instruments. They both stated that they preferred Taptop over their usual hardware setups due to the ease of portability that it afforded them.

Conclusion

We have demonstrated a real-time method of virtual instrument control that uses only the built-in webcam and microphone that come as standard with most laptop computers. The development of this application sought to address a number of the issues associated with current methods of virtual instrument control, particularly concerning software drum machines and samplers. One objective was to release musicians from the burden of transporting and accommodating hardware MIDI controllers through a gesture control system that possessed the same functionality. This was achieved by exclusively utilising the built-in features of a laptop computer to control the virtual instrument, the only external component required being a small mirror attachment. By basing Taptop's control metaphor on a bank of drum pads, the functionality of the application is largely identical to an already well-established music control interface.

Another objective was to explore the potential of the computer as a performance instrument through the development of an expressive method of control for virtual instruments. Normally, without the use of an external device, musicians are restricted to composing on a computer via their DAW piano roll. Since this method of control is incapable of facilitating real-time performance, it is understandable why many musicians hesitate to class the computer as a musical instrument (Wessel and Wright 2002). However, with such a wealth of music now being made exclusively using computers, it seems strange that this should be the case. Taptop expands the potential of the computer as a musical instrument by enabling a laptop to be used for standalone performance, where the audience can directly relate the physical gestures of the performer to the music they are hearing (Ciciliani and Mojsysz 2014).

Further work needs to be done to improve Taptop's reliability, particularly when it is used in adverse lighting conditions. If the lighting is too bright or too dark, the motion detection algorithm is less likely to be able to detect user movement. Making the motion detection algorithm adaptive to different lighting conditions could address this, so that in difficult environments the detection threshold is automatically lowered, increasing the chances of user movement being detected.
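The paper proposes the adaptive idea without specifying a mechanism; one possible sketch, entirely under our own assumptions, scales the motion-pixel threshold down as the frame's mean brightness drifts away from mid-grey, where frame differencing has the most contrast to work with:

```python
import numpy as np

def adaptive_motion_threshold(frame: np.ndarray,
                              base_threshold: int = 200,
                              min_threshold: int = 50) -> int:
    """Illustrative adaptive scheme (not the paper's): lower the changed-
    pixel threshold in very dark or very bright scenes so faint motion
    still registers. frame is a greyscale uint8 array.
    """
    mean_brightness = float(frame.mean())
    # 1.0 at mid-grey (128), falling towards 0.0 at the extremes
    contrast_factor = 1.0 - abs(mean_brightness - 128.0) / 128.0
    return max(min_threshold, int(base_threshold * contrast_factor))
```

In well-lit conditions this leaves the threshold at its base value; in a dark venue it drops towards the floor, trading some robustness to noise for a better chance of detecting the user's hand.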

References

Bisby, Helena, Adam Cooper, Stephen James, Alec Robertson, and Kia Ng. 2014. "A Tactile Audio Visual Instrument Using Sound Source Localisation." In Proceedings of the 2014 Electronic Visualisation and the Arts Conference, 200–206.

Bown, Oliver, Renick Bell, and Adam Parkinson. 2014. "Examining the Perception of Liveness and Activity in Laptop Music: Listener Inference About What the Performer Is Doing from the Audio Alone." In Proceedings of the 2014 International Conference on New Interfaces for Musical Expression, 13–18.

Ciciliani, Marko, and Zenon Mojsysz. 2014. "Evaluating a Method for the Analysis of Performance Practices in Electronic Music." In Proceedings of the 2014 International Conference on Live Interfaces, 45–61.

Dobrian, Christopher. 2001. "Aesthetic Considerations in the Use of Virtual Music Instruments." Journal of the Society for Electro-Acoustic Music in the United States 16 (2): 23–33.

Ellis, Jeremy. 2011. "Jeremy Ellis Performing on Maschine - Unlike Any Other." https://www.youtube.com/watch?v=FcJCxe1VSLA [Accessed: 21 Feb. 2016].

Fiebrink, Rebecca, Ge Wang, and Perry Cook. 2007. "Don't Forget the Laptop: Using Native Input Capabilities for Expressive Musical Control." In Proceedings of the 2007 International Conference on New Interfaces for Musical Expression, 164–67.

Gillian, Nicholas, and Sarah Nicholls. 2012. "A Gesturally Controlled Improvisation System for Piano." In Proceedings of the 2012 International Conference on Live Interfaces.

Hong, DaeRyong, and WoonSeung Yeo. 2013. "Laptap: Laptop Computer as a Musical Instrument Using Audio Feedback." In Proceedings of the 2013 International Conference on New Interfaces for Musical Expression, 233–36.

Novello, Alberto, and Antony Raijekoff. 2015. "A Prototype for Pitched Gestural Sonification of Surfaces Using Two Contact Microphones." In Proceedings of the 2015 International Conference on New Interfaces for Musical Expression, 170–73.

Pavlovic, Vladimir, Thomas Huang, and Rajeev Sharma. 1997. "Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review." IEEE Transactions on Pattern Analysis and Machine Intelligence 19 (7): 677–95.

Pinheiro, Eduardo, Wolf Weber, and Luiz Barroso. 2007. "Failure Trends in a Large Disk Drive Population." In Proceedings of the 5th USENIX Conference on File and Storage Technologies, 17–18.

Puckette, Miller. 1997. "Pure Data: Recent Progress." In Proceedings of the Third Intercollege Computer Music Festival, 1–4.

Theberge, Paul. 1997. Any Sound You Can Imagine: Making Music, Consuming Technology. Hanover, New Hampshire: University Press of New England.

Wessel, David, and Matthew Wright. 2002. "Problems and Prospects for Intimate Musical Control of Computers." Computer Music Journal 26 (3): 11–22.

Zamborlin, Bruno. 2016. "Mogees." http://mogees.co.uk [Accessed: 28 Apr. 2016].

Zamborlin, Bruno, Frederic Bevilacqua, Marco Gillies, and Mark d'Inverno. 2014. "Fluid Gesture Interaction Design: Applications of Continuous Recognition for the Design of Modern Gestural Interfaces." ACM Transactions on Interactive Intelligent Systems 3 (4): 30–45.