volume/brightness control in touchscreen musical instrument


Posted: 09-Nov-2015
DESCRIPTION

This paper describes a unique algorithm (patent application in progress) that controls the sound brightness based on a number of datapoints, such as the frequency of touches, the position of the note, and the strength of the touch.

TRANSCRIPT

Tonal Brightness/Volume control based on frequency of touches, note position and touch strength

    Author: Roman Glistvain

    Abstract

There are very few (if any) smartphone apps which provide guitar-like accompaniment functionality as a musical instrument. This paper describes an algorithm (patent application in progress) which makes it possible to play accompaniment where the volume of the instrument closely and naturally follows the song. This makes it possible to provide a realistic playing style in which the player can play naturally, without making mistakes, while the volume/brightness of the song follows the player's voice.

    Introduction

TravelGuitar/Брынчатель is a smartphone musical instrument[1] which plays a repetitive note progression or a strumming sequence (depending on the style) as a result of a single touch-and-hold. Within this sequence there are "preemption points" at which the user can release and press the same or a different chord without interrupting the flow of the music. If the same chord button is pressed, playback jumps to another position in the sequence (e.g. starting with the next bass). These preemption points serve as a grid to lock onto if the user releases/presses a button a little early.

The purpose of TravelGuitar is to provide accompaniment for singing various songs. When trying out my instrument with a variety of songs, I noticed that the sound is fairly plain and does not follow the voice: the volume of the voice changes within the song, but the instrument's volume does not. I tried using the accelerometer to determine the strength with which different notes were touched and, based on that strength, to select a different volume and/or a different sample of the same note, but the accelerometer's accuracy was not great[2].

During my experimentation I discovered that when I sing louder, I press buttons with higher frequency and often with higher pressure. On a smartphone, the touch frequency can be measured precisely, but the touch pressure cannot. However, an algorithm which combines the touch frequency with a rough measurement of touch strength, as well as some additional datapoints, yields a calculated sound volume which closely follows the voice. This is where my algorithm of "Tonal Brightness/Volume control based on frequency of touches, note position and touch strength" comes into play (patent application in progress).

    Algorithm

When a user touch is registered, several associated datapoints are recorded. The following is an example set of recorded datapoints for each touch:

1) The position within the currently playing sequence where the touch occurred (the specific preemption point)
2) The chord button that was touched (the same chord or a new chord)
3) Strength of the touch (very rough)
4) Frequency of touches:
4.1) whether the previous preemption point was pressed
4.2) overall frequency of touches over the last K touches (the number of preemption points at which the user touched a button vs. the overall number of preemption points)
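A per-touch record of the datapoints above might be sketched as follows. This is a minimal illustration: the field names, types, and the helper for item 4.2 are assumptions, not taken from the actual TravelGuitar implementation.

```python
from dataclasses import dataclass

@dataclass
class TouchDatapoint:
    """Hypothetical record of the datapoints listed above (1-4.2)."""
    sequence_position: int        # 1) preemption point within the playing sequence
    chord_button: str             # 2) which chord button was touched
    same_chord: bool              # 2) same chord as before, or a new one
    strength: float               # 3) rough touch strength, e.g. 0.0..1.0
    previous_point_pressed: bool  # 4.1) was the previous preemption point pressed
    recent_touch_ratio: float     # 4.2) touched preemption points / total, over last K

def compute_touch_ratio(pressed_flags):
    """4.2) fraction of the last K preemption points at which a button was touched."""
    return sum(pressed_flags) / len(pressed_flags)
```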

The above datapoints are passed to the style-specific calculation function, which uses the datapoints associated with the current touch along with the datapoints of the M previous touches in order to compute the tonal brightness/volume for the next sound sequence to be played. Tonal brightness is achieved by using a different sound sample of the same note/strum element, one which sounds brighter or duller to the human ear.
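The flow of datapoints into such a calculation function could be sketched as follows. The value of M, the function names, and the history mechanism are illustrative assumptions only.

```python
from collections import deque

M = 8  # number of previous touches to keep (value assumed for illustration)
history = deque(maxlen=M)  # automatically discards touches older than M

def on_touch(datapoints, compute_brightness):
    """Feed the current touch plus up to M previous touches to the
    style-specific calculation function, which returns the volume/
    brightness for the next sound sequence to be played."""
    volume = compute_brightness(datapoints, list(history))
    history.append(datapoints)
    return volume
```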

    The above function can be implemented as a state machine. Different sets of states represent different sound levels within the song. Each style implements a different state machine.

An example of such a state machine is the following:

The state machine diagram describes a style which consists of:
1) first bass sound (preemption point)
2) sequence of notes/strums (some styles have preemption point(s) here)
3) second bass sound (preemption point)
4) sequence of notes/strums (some styles have preemption point(s) here)

Pressing the same chord results in a transition to the other bass within the same chord, and playback continues from that position.

[State machine diagram; the states and transition labels recovered from the figure are:]

Low volume state:
- low touch on first or second bass, volume = 0.4
- first or second bass strong, volume = 0.8

Regular volume:
- light touch on first bass, volume = 0.4
- light touch on second bass, volume = 0.7

High volume:
- no skips and average touch strength, or low touch on the second bass, volume = 1.0
- high frequency over the last 4 touches and first bass strong, volume = 0.9
- not maximum frequency of touches or low strength on the first bass, volume = 0.7

As you can see in the above state machine, the datapoints recorded along with the touch are used to perform state transitions and to select a new volume. This volume determines both the playback volume of the sample and the selection of a specific sample of the same note (brighter or duller).
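Mapping the computed volume to a sample variant of the same note could look like this; the thresholds and file names are invented for illustration and are not the actual TravelGuitar assets.

```python
# Illustrative volume -> sample-variant mapping: louder playing selects a
# brighter-sounding recording of the same note (thresholds/names assumed).
SAMPLE_VARIANTS = [
    (0.5, "note_dull.wav"),
    (0.8, "note_medium.wav"),
    (1.0, "note_bright.wav"),
]

def select_sample(volume):
    """Pick the first sample variant whose threshold the volume falls under."""
    for threshold, sample in SAMPLE_VARIANTS:
        if volume <= threshold:
            return sample
    return SAMPLE_VARIANTS[-1][1]
```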

The historical data is integrated into the state itself: to reach a specific state, certain events must have happened in the past.

Using this approach makes it possible to achieve a repeatable performance in which the instrument is well aligned with the emotions of the singer. The real state machines for the implemented styles are much more complicated than the above example, which is used only to describe the algorithm.

    References

1) TravelGuitar Android App - https://play.google.com/store/apps/details?id=com.brynchatel.ad&hl=en
2) http://141.84.8.93/pubdb/publications/pub/essl2010pressuremusic/essl2010pressuremusic.pdf
