
Spherical Interactions
Master Graduation Project

By Joep Elderman


Contents

Abstract
Introduction
Theoretical Framework
Process
  1.1 First iteration
  1.2 Second iteration
  1.3 Third iteration
  1.4 Final iteration
Final Prototype
  1.0 Concept
  1.1 Design
  1.2 Mechanics
  1.3 Electronics
  1.4 Software
Client Evaluation
Discussion
Acknowledgements
References
Image list
Appendix A: Interview Quotes


Abstract
An affordance is a relation between a user and an object that exhibits the possibility of a certain action. Naturally, these action possibilities are communicated primarily through the visual modality. This graduation project aims to use non-visual modalities to communicate action possibilities and give feedback in a dynamic way. Doing this dynamically means that action possibilities are presented only when needed. The above is used to create a spherical remote control for a Bang & Olufsen media centre. Through the use of haptic effects, two different interaction paradigms are developed that do not rely on the visual modality. In one model, haptic effects are used only in response to human action. In the second, haptics is used to permanently communicate the state of the system, and the user is given the opportunity to modify this state in an embodied way. Together these models demonstrate the potential of dynamic haptic affordances and contribute to the research on intuitive interaction.

Introduction
This report describes my journey towards designing a remote control for a media centre that consists solely of a handheld sphere. It brings together my fascination for tangible interaction design with the control of media. The project has a quite straightforward but interesting constraint: the physical shape of the remote has to be a sphere with no buttons or visible controls. This constraint brings with it some interesting challenges, on which I will elaborate in depth in this report.

I do not claim that these constraints will still allow me to design a commercially viable and practical remote control. Instead, I am interested in using other input and feedback modalities, like gestures and haptics, to see if this can still result in a usable remote control. The lessons from this rather radical project can be applied to further enrich a (remote) control experience in contexts where the above constraints do not apply.

I created the brief for this project together with Lyle Clarke, a senior UX designer at Bang & Olufsen. B&O is a Danish high-end hi-fi company, known for both outstanding audio performance and eccentric form-giving and interaction design. I proposed to them a redesign of a product in their product line with a richer and more tangible user experience. Together we initiated the spherical interactions project.


Theoretical Framework
This project started with a simple question: is it possible to create a fully spherical device that is an intuitive remote control for a media centre? Answering this question requires a definition of intuitive, which I define as having the ability to know or understand things without any proof or evidence [1]. In an ideal case, this means that a user can pick up the remote control and use it immediately to its full potential. Products that have this property of intuitiveness usually comply with Gibson's theory of affordances [2]. Affordances are defined as a relationship between an object and its user that exhibits the possibility of some action. For example, a cord affords pulling but not pushing. When one sees a cord for the first time, this ability to pull is immediately apparent; we could say it is intuitive. It is important to note the significance of the visual modality in perceiving affordances. In the example of a cord, it is primarily through seeing that the user knows it can be pulled and not pushed. Through the visual modality, the cord communicates an action possibility to the user. Affordances that concern design features that help users perform physical actions are called physical affordances [3].

Physical affordances of spherical shapes are hard to define universally because they vary with the size and material of the object. A small sphere can be held with just the fingers, while a larger sphere needs an entire hand. A sphere made of a spongy material affords squeezing, while a metal one does not. A generic property of spheres is their perfect symmetry, but this is problematic where physical affordances are concerned: a sphere has no defined top or bottom, nor a single right way of holding it. This makes it hard to use physical affordances on a sphere.

Up until this point, we have looked at the physical affordances of spheres without taking into account their use as a remote control. Regular remote controls, just like most electronics, rely almost exclusively on cognitive affordances [3], e.g. a button label that helps users know what will happen if they click it. This means that especially when a user is new to a product, he has to look at the labels to find the right action possibility. With practice, the user will eventually learn the functions of the most common buttons and be less likely to look before a certain action. But since we are talking about a remote control, it is arguably not desirable to look at the object at all, since the action the user is interested in takes place somewhere else entirely. If the remote control had no buttons at all, there would be no need for cognitive affordances on the device. This means the user can focus on the content he is watching instead of on the control interface. Even though the user no longer needs to look at the device, there is still a process of learning to operate it. The length of this learning process can be greatly reduced by making extensive use of feedback and feedforward mechanisms [4].

What are the right feedback and feedforward mechanisms for a spherical device? To answer this question, we first have to look at what action possibilities are available on a spherical device. As described by Angelini et al. [5], the action possibilities with a sphere (or, in fact, any handheld object) can be classified according to a combination of three relationships with an object: move, hold and touch. For example, a combination of hold and move means moving the object through physical space. Hold and touch together means gripping an object in a certain way. By assigning functionality to certain combinations of the three relationships, a gestural interaction language is established [6]. To create intuitive interactions, we have to look at the coupling between the user's actions and the functionality that is being controlled, with respect to the six aspects of natural coupling as defined in the Frogger framework [7]. The tighter the coupling, the more intuitive the interaction is perceived to be. However, an extremely tight coupling can also be undesirable or even defeat the purpose of the object. For example, the tightest possible coupling in the location modality for a volume control of a speaker would be to place the control on the speaker itself, which in turn defeats the purpose of a remote control. Therefore, other factors like ease of use and ergonomics should also factor into the design process.

In order to have a strong coupling between user action and object reaction, there needs to be both feedforward on the user's action possibilities and feedback on the object's understanding and task execution. As illustrated in the first paragraphs, providing feedforward with a spherical device is challenging. To overcome this issue, I propose to use feedback on a previous action as feedforward on the next action. For example, the inFORM display [8] can react to a user moving a ball on its surface by changing the shape of that same surface, in turn creating new action possibilities for the user. The options the user has to move the ball depend on where the ball is currently located.

Using haptic [9] and shape-changing [10] feedback loops, this project aims to create feedforward based on previous feedback. This process makes the resulting feedforward dynamic and allows for presenting action possibilities only when they are actually actionable; for example, not presenting a volume control when there is no audio playing. Using this dynamic feedforward, I want to create a spherical remote control for a media centre. Not because a sphere is an ideal shape for a remote control, but because this shape forces me as a designer to think beyond the obvious with respect to interaction design.

Apart from using the relationship between the remote control and the controlled device as the basis for affordance, we should also look at the relationship between the spherical remote control and the body of the user. An embodiment [14] of the interaction could be an alternative to a more cognitive, gesture-based interaction paradigm.


Process
The project consists of four design iterations. The first iteration has a very explorative nature; it focuses on the opportunities within haptics and shape change as means of user feedback and feedforward. The goal is to create experienceable prototypes that show different ways of interfacing with specific functionality, for example changing the volume. These early prototypes do not have to contain all the functionality that is needed in the end. The output from iteration one is shown to design professionals at Bang & Olufsen during my first visit to their Denmark HQ. In a co-creation session, these designs are evaluated and the most interesting properties of the prototypes are taken into the second iteration. In this iteration, the focus shifts towards prototypes that allow for more functionality, with each prototype adhering to a single feedback mechanism: one prototype uses shape-changing feedback, one uses haptic feedback and one has inherent mechanical feedback. These three prototypes were evaluated after the demo day of M2.1 based on feasibility, intuitiveness, usability and whether they would contribute to a B&O brand experience. The M2.2 semester contained two more iterations in a single direction, based on one of the three prototypes of the last iteration of the M2.1 semester. An expert review to investigate the strength of the coupling between the user input and the system's output, and thereby the intuitiveness of the prototype, served as a final evaluation.

First iteration
In this iteration, I started by looking at new ways a spherical device could give feedback on user action by means of shape change and haptic feedback. I was looking for ways to communicate back to the user without having to use the visual modality. This worked especially well for the haptic explorations that I did with rotational weight shifting as a means for a playback indicator. In a couple of other explorations, I tried to reproduce this effect by creating a rotational sensation with four vibration motors that were switched on and off sequentially. In my first attempt, the frame I made to hold the motors was too rigid and easily transferred the vibrational energy throughout the frame. To solve this, I incorporated a simple spring in the design that isolated the individual vibration motors to some extent. This helped in separating the individual vibrations and, therefore, strengthened the rotational sensation, but it was far from perfect. The idea surfaced to make use of phase shifting the individual motors relative to each other to create a movable vibration optimum, but this option was not extensively explored.

The shape-changing front progressed somewhat less in this exploratory phase, since I struggled to find ways for the device to remain a sphere before and after the shape change. I looked at Hoberman spheres to be able to dynamically change the size of a sphere over time, and did some other experiments to create different interaction surfaces for different tasks.

In the first iteration, I also looked at the action possibilities of spherical devices based on the move, hold and touch framework for tangible gestures. By documenting many different interactions, I noticed that without explicitly designed feedback, nothing you do to a sphere results in a perceivable system change.

In an evaluation session at Bang & Olufsen, all these explorations were demonstrated and the general directions of the second iteration were determined. We also had an extensive discussion on what direction the project should take. At Bang & Olufsen, the interest in spheres as a remote control comes from their belief in a "health and happiness" direction they want to take for their future products. At the same time, I approach the use of spheres from a more pragmatic, pure interaction perspective. The interesting thing is that while our motivations do not exactly align, the developments I have made are interesting regardless of which perspective you take. I realized that the use of non-visual feedback from a device puts less of a strain on the attentional resources of the user. This more relaxed interaction paradigm could reduce the stress experienced during use, allowing a user to focus more on the content he is watching.

Second iteration
The goal of the second iteration was to translate the insights from the exploratory first iteration into a more concrete set of prototypes. From the first iteration, three directions were chosen. The first direction would be a continuation of the weight-shifting feedback mechanism as a starting point for novel media interactions. The second would use mechanical mechanisms to show and hide action possibilities only when they are needed. For the final direction, I wanted to combine haptic vibration feedback with absolute positioning of a sphere to create a navigable "haptic landscape": by moving the sphere, the user can feel certain bumps and walls that guide him in navigating the interface. All three of these directions I wanted to develop to a level where there would be both different user action possibilities and a variety of system-to-user feedback; the resulting prototypes are described below.


Localized vibration
The aim of this prototype is to create a rotational sensation with vibration motors. The speed of the overall rotation is determined by the speed at which the individual haptic motors are sequentially turned on and off. Faster speeds were perceived as a more convincing rotation compared to slower sequential driving of the motors. The rotational effect appeared to feel stronger when not looking at the device.

Weight shifting
The weight-shifting design is a direct further development of a previous iteration. It still works by rotating an off-centre weight around a central axis, but the mapping has improved. When the weight is rotating at a given speed, it indicates that a song is playing. If one wants to change the volume of the audio playing, the user must swirl the device (similar to swirling a wine glass). To increase the volume, the sphere has to be swirled in phase with the already existing motion, effectively amplifying it. In reaction, the weight will spin at a higher velocity. When the user wants to lower the volume, the swirling has to take place out of phase, and the spinning speed is lowered as a result of the user action. This interaction feels very natural, but due to its limited parameters it cannot easily be extended beyond volume control in a coherent manner.

Haptic spatial constraints
This design consists of a combination of spatial sensing capabilities (accelerometer, gyro sensor and magnetometer) and a vibration motor. The aim is to use the absolute position of the object to create a haptic virtual space around it. This space can contain vibrating haptic bumps and walls. Effectively, this prototype creates a space for the user to operate in that can adapt to what controls are available to the user at a certain time. There could be a different haptic space for basic play and pause controls versus more complex options such as selecting a musical mood. This prototype is still very crude at this point due to some technical limitations of the current sensors. But in my opinion it shows great potential, because its opportunities for multiple controls are the most diverse.


In order to demonstrate the need for feedback on the remote control, I also made a simple prototype that could change the volume of a song by rotating a spherical device, but that gave no feedback on the user's action on the remote control itself. I used this device to illustrate the overall mapping problem you have without feedback at the location of user action.

Third iteration
The third iteration focused primarily on the making of a navigable haptic landscape. In order to achieve this, I had to come up with a way to exert a net force on the remote control when it hits a haptic wall in the landscape. The problem is that the exertion of a net force normally requires an actuator attached to both the remote control and an immovable reference; since the remote control is handheld, this was a problem. Research by Jun Rekimoto [11] provided a solution: he created a virtual force sensation without an external reference by exploiting the non-linearity of human force perception. The human senses are very sensitive to short, high-impact forces but not to forces with a long duration and a low impact. To test this hypothesis I built a prototype which consisted of a strong magnet suspended with springs in the middle of a large coil. By sending an asymmetrical AC current (a sawtooth wave) through the coil, I managed to recreate the effect described in Rekimoto's paper, but with a lot more force.

But since the rig was fairly large, it was highly impractical for a spherical remote control. I redesigned the actuator and managed to reduce its size significantly by removing the springs and placing coils at both ends of a closed tube containing the magnet. Both rigs performed well at low subsonic frequencies, up to about 5 Hz. Higher frequencies felt more like vibrations than force pulses with a sense of direction.

Having established a means to create the virtual forces needed to experience haptic walls, the next step was to design the haptic space that the user could navigate. I prototyped this very quickly using a computer mouse with off-the-shelf vibration motors attached to it. The goal was to make a layout for the haptic space immediately experienceable: a user would navigate through the space to select menu items. I tried many different layouts and control mappings with the haptic mouse and collected insights through a small user test. I first explained what I was trying to achieve and then showed users some different mappings that they could try to navigate. This went well when users also had on-screen feedback on their actions, but when users had to navigate the space relying only on haptics, they quickly got lost and were unable to navigate the created space.


At this point, I realised this was highly problematic for my concept. I had assumed that it would be easy to construct a mental model of a haptic space by bumping into these artificial haptic walls, but this was not the case. I realised that vision is highly unique in its ability to create an overview of a space: a visual wall can be avoided proactively by steering away, without ever hitting it. When a user has to rely on haptics, there is no way of experiencing the tactility of the entire space at once; the user has to rely on feedback received from hitting individual walls in order to construct a mental model of the haptic space in its totality. This takes much longer and is very prone to perception errors.

Final iteration
The interaction paradigm used so far, with location-based reactive haptic feedback, did not allow for an intuitive mapping. Therefore, the final iteration focused on developing an alternative interaction paradigm. The most important insight from the last iteration is that for an interface relying solely on haptic feedback, reactive feedback alone is not enough. Most often, haptic feedback is used to confirm an action the user just performed. The haptic walls I envisioned are no different: they tell you that you have hit a wall, not that you are about to hit one. The dynamic and proactive means of communicating the state of a system used in Philip Ross's Carrousel [12] inspired a new direction for how haptics could be used to communicate the current state of the system. By proactively communicating the state of a system instead of only reacting to the user's actions, an awareness is created of the current state of the system. In the case of the Carrousel, a spinning wheel is used to communicate the energy in a room, but this same spinning wheel can also be used to change the energy of the room by slowing it down with your hand.

Much like the Carrousel, I wanted to use rotation as a feedback modality while at the same time using rotational movements as gesture input. For example, the remote invites you, by rotating very subtly, to perform a rotational gesture, which will initiate playback. Once playback is initiated, the rotational sensation increases in strength and velocity, thereby providing feedback on the successful performance of the gesture. If a user wants to stop playback, he can counter the rotational haptics of the device by performing a gesture in the other direction. The force effects explored in the previous iteration can be utilised to signal the selection of next and previous tracks by giving a force impulse in the associated direction. I was very happy with this more embodied approach to remote control and decided to prototype it next to the previously explained reactive haptic setup, so I could show both ideas to the client and discuss their pros and cons.


Final Prototype

Concept
The final prototype serves as a demonstrator of the concepts that were developed in the last two iterations. Therefore, the prototype contains both the embodied and the spatial navigation interaction paradigms in a single remote control. This enables comparing both, and allows for a comprehensive demonstration of the haptic qualities that can be achieved within each paradigm.

Spatial navigation
This concept is related to the work in iteration three. It uses the movement of the spherical remote control relative to a central starting point to select control options. A linear haptic actuator provides the user with feedback about which option he has currently selected. The feedback in this mode is strictly reactive; there is no feedforward.

For demonstration purposes, users can navigate a simple menu structure either with or without additional visual feedback. Every menu item has a distinct haptic sensation that separates it from the other items and aids in forming a mental map of the space.

Embodied control
This mode relies on a variety of circular motions performed by the user to initiate action. The feedback is provided by a motor with a weight attached to one side. When the motor spins, the weight creates a gyroscopic effect that causes the hand to move in the same direction as the performed gesture. To showcase this interaction paradigm I made a content playback example. When the user picks up the remote control, the gyroscope makes one 360-degree rotation in the direction of the playback gesture, as feedforward on the user's action possibility. After the user makes a clockwise rotational movement, music starts playing, slowly ramping up in tempo; at the same time the gyroscope ramps up in speed as well, rotating in the same clockwise direction as the gesture. If the user wants to skip to the next song, he has to make a clockwise rotational movement that is faster than the current speed of the gyroscope.

The way to skip to the next song in the current implementation was the result of a small user test in which I asked how people would change to the next song, after explaining to them how to start their initial song. Two of the four users used linear swipe gestures, while the other half used the same circular motion they used for starting the first track. Since there was no clearly dominant preference and the linear swipes did not fit as well in the interaction paradigm, the circular motion was chosen for skipping. To stop playback of a song, a user has to make a counter-clockwise motion and work against the gyroscopic motion.

Design
Since the idea was to use a sphere without any buttons as a remote control, I initially thought there were not many aesthetic design considerations to be made. Nevertheless, I wanted the device to have a high-quality feel and to resonate with the Bang & Olufsen brand identity. Because I wanted the user to be able to distinguish the top and bottom of the device, I decided to make it from two different materials: American walnut for the top part and anodised aluminium for the bottom half. Both materials have a very distinct tactile feel, mainly due to how heat is transferred from the hand to the material. This property allows users to feel whether they have the right side up without having to look. The bottom of the sphere is slightly flattened to let it rest on a table without rolling over. The top is also slightly flattened to allow for the B&O logo to be placed on it. The logo inlay is made of the same material as the bottom half of the sphere.

At the moment of writing, the aluminium parts have not been milled yet and are substituted by Eastern Cottonwood. This is a very light wood, but unfortunately not very different from a tactile perspective. Initially, I wanted the cutting line between the materials to be straight, but after some experimentation I decided that a curved cutting line would be much more aesthetically pleasing.


Mechanics
Designing the mechanical aspects of the remote control was quite challenging, since the device needed to contain multiple sensors and actuators in a really tiny form factor. Because of this, my ambition to make the device wireless was scrapped; there simply was not enough room for it. Instead, a ground plane was added that provided tracking sensors for the sphere. The sphere has an outer diameter of 70 mm, the largest dimension that still fits ergonomically in small hands. Inside the wooden outer shells there is a 3D-printed inner shell. This inner shell is needed due to limitations in the 3-axis CNC milling process, which did not allow for the required overhangs. The inner shell consists of two parts that form-fit together. These two parts hold all the functional components of the prototype and snap to the outer wooden shells with magnets. In the bottom of the inner shell, the gyroscope mechanism is mounted. This mechanism consists of a geared DC motor, a ball bearing, and a connecting arm with weights that cause the gyroscopic effect. The weights are made of 1/4-inch tungsten cubes; tungsten is one of the densest materials (19.25 g/cm³) available at a reasonable price. The arm is made of cast bronze to add weight to the gyro effect. On top of this assembly rests a diametrically magnetised magnet, centred on the axis of the motor, which allows a magnetic rotary encoder to track the position of the arm. The cables for the DC motor lie in cavities in the side of the inner shell; this prevents the moving arm from damaging the cables and cutting its own power. The bottom shell also contains four small magnets that attach it to the outer shell. The top part of the inner shell holds the PCB of the magnetic rotary encoder and the linear haptic actuator that is used for the alternative interaction paradigm. The top part also features two magnets to attach it to the top outer shell. An exploded view of the assembly is shown below.


[Figure: exploded view of the sphere, showing the top shell, sensor, gyroscope, bearing, Haptuator, motor, the two inner shell parts, and the lower shell.]


Electronics
The electrical composition can be subdivided into three parts. The Hall effect (magnetic) sensors are used to track the motion of the ball relative to the array of Hall effect sensors embedded in the ground plane. Data generated by these sensors is sent to the computer for processing; more on this in the software chapter.

The linear actuator used for the menu browsing demo is called the Haptuator [13] and is developed by Tactile Labs in Canada. It is basically a miniaturised version of the linear actuator I built in the third iteration, but because the mass it moves around is very small, it is not capable of achieving the same effects. To drive the Haptuator I used a one-channel class D amplifier that can output 12 watts of continuous sine-wave power, more than enough to create the haptic effects needed for this project. The amplifier takes in line-level audio, which means I could use waveforms generated by my computer and output through the headphone jack to drive the Haptuator. This made experimenting with a large variety of haptic effects very easy.
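As an illustration of this audio-driven approach, an asymmetric sawtooth like the one used for directional force pulses can be generated and played out of the headphone jack with a few lines of Python. This is a sketch, not the actual waveform code used in the project; the 4 Hz frequency and the 90/10 rise/fall split are placeholder values.

```python
# Sketch of generating an asymmetric sawtooth (slow rise, fast fall) and
# playing it out of the headphone jack into the class D amplifier.
# The frequency and rise/fall split below are placeholder assumptions.
import numpy as np
import sounddevice as sd

def asymmetric_saw(freq=4.0, duration=2.0, rise_frac=0.9, sr=44100):
    t = np.arange(int(duration * sr)) / sr
    phase = (t * freq) % 1.0                                  # 0..1 per cycle
    rising = phase / rise_frac                                # slow ramp up
    falling = 1.0 - (phase - rise_frac) / (1.0 - rise_frac)   # fast drop
    wave = np.where(phase < rise_frac, rising, falling)
    return (wave * 2.0 - 1.0).astype(np.float32)              # scale to -1..1

sd.play(asymmetric_saw(), samplerate=44100, blocking=True)
```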

The gyroscope setup in the bottom of the inner ball is driven by a transistor-based H-bridge, which allows it to be turned both clockwise and counter-clockwise. This bidirectional ability is also used as a braking mechanism, by applying a reverse current to the motor until it stops. To monitor the position of the motor, a high-precision magnetic rotary encoder was used, with 4096 steps per 360-degree rotation. The readings of this sensor were used to perform hard stops of the gyro at any angle, but also to keep the motor turning at a set angular velocity regardless of outside influences; this PID controller is explained in detail in the software section.

[Figure: hardware overview diagram. Components: Mac computer, Arduino Mega, Teensy, multiplexer array, Hall effect sensors, class D amplifier, linear actuator, transistor H-bridge, rotary encoder and DC motor; the legend distinguishes sensors/actuators, controllers and other components.]


Software
The software that runs the prototype was built in a highly segmented way, as can be seen in the software diagram. This segmentation provides a high degree of flexibility and the potential for experimentation by changing, adding or removing parts. In the next paragraphs, I will describe the function of all software elements, starting with the parts that determine the position of the sphere.

Signal multiplexer
The position of the remote control is determined by tracking a magnet in the bottom of the sphere on top of an array of Hall effect sensors (magnetic direction sensors). The signal multiplexing step selects individual Hall effect sensors from the array through a set of multiplexers and reads their individual X, Y and Z magnetic values. These values are sent over a serial bus to the computer. Faster readouts of the entire array result in more frequent updates of the position of the sphere and thereby improve accuracy. The array is read at a frequency of 400 Hz.
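To make the readout concrete, here is a minimal host-side sketch of how such a serial stream could be parsed. The frame format (one comma-separated line of X, Y, Z triplets per scan), the port name and the sensor count are illustrative assumptions, not the actual protocol used in the prototype.

```python
# Minimal host-side sketch of reading the multiplexed Hall effect stream.
# Frame format, port name, baud rate and sensor count are assumptions.
import serial  # pyserial

NUM_SENSORS = 16                      # assumed size of the sensor array
PORT, BAUD = "/dev/tty.usbmodem1", 115200

def read_frame(ser):
    """Read one scan of the array as a list of (x, y, z) tuples."""
    line = ser.readline().decode(errors="ignore").strip()
    values = [float(v) for v in line.split(",")]
    return [tuple(values[i:i + 3]) for i in range(0, 3 * NUM_SENSORS, 3)]

if __name__ == "__main__":
    with serial.Serial(PORT, BAUD, timeout=0.1) as ser:
        print(read_frame(ser))
```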

Hall effect integration
The goal of this module is to take the input stream of Hall effect data and output an estimation of the position of the magnet that is being tracked. For this, an algorithm called GaussSense [x], developed by Rong-Hao Liang et al., is used. On start-up the system is calibrated by assuming that the current magnetic state of the system is an idle state. This way, all static magnetism induced by other magnets lying around the Hall effect array is cancelled out. The changes relative to this idle state induced by the tracking magnet in the bottom of the remote control are recorded. From these change vectors of the Hall effect sensors, a centroid is created at the location the vectors point at.
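The following is a simplified sketch of this calibrate, subtract and centroid idea, not the actual GaussSense implementation: a baseline is recorded at start-up, and each sensor's known position is weighted by the magnitude of its magnetic change vector. The 4x4 grid layout is an assumption.

```python
# Simplified sketch of idle-state calibration and centroid estimation.
# Not the actual GaussSense algorithm; the 4x4 grid is an assumption.
import numpy as np

SENSOR_XY = np.array([[x, y] for y in range(4) for x in range(4)], dtype=float)

def calibrate(idle_frames):
    """Average idle frames (n_frames x n_sensors x 3) into a baseline."""
    return np.asarray(idle_frames, dtype=float).mean(axis=0)

def estimate_centroid(frame, baseline):
    """Estimate the magnet position as a change-weighted sensor average."""
    change = np.asarray(frame, dtype=float) - baseline   # n_sensors x 3
    weights = np.linalg.norm(change, axis=1)             # per-sensor magnitude
    if weights.sum() == 0:
        return None                                      # no magnet detected
    return (SENSOR_XY * weights[:, None]).sum(axis=0) / weights.sum()
```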

Because a special diametrically magnetised magnet was used, both the positive and the negative pole of the magnet were visible to the Hall effect array. Because of this, not only the position of the magnet could be derived from the data, but also its rotation and pitch relative to the board. The extended position information created in this step is transferred over the UDP network protocol to both the gesture interpreter application and the menu demo application whenever the position of the magnet changes.

Menu demo
This simple demo application allows a user to navigate a menu both in a visual way, by moving the remote control to various positions, and in a purely haptic way, by feeling a different haptic sensation for each menu item. The position data from the Hall effect integration is used to determine which button the user is hovering over and which haptic effect should be played. The haptic effects are short audio files that are played back through the Haptuator linear actuator.
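A minimal sketch of this hover-to-haptics mapping could look as follows; the menu regions, the .wav file names and the use of the simpleaudio package are illustrative assumptions rather than the actual demo code.

```python
# Sketch of the hover-to-haptics mapping in the menu demo. Menu regions,
# file names and the simpleaudio package are assumptions.
import simpleaudio as sa

MENU = {                     # hypothetical x-ranges mapped to haptic textures
    (0.00, 0.33): "texture_click.wav",
    (0.33, 0.66): "texture_ripple.wav",
    (0.66, 1.00): "texture_buzz.wav",
}

def on_position(x):
    """Play the haptic clip of the menu item the user is hovering over."""
    for (lo, hi), wav in MENU.items():
        if lo <= x < hi:
            sa.WaveObject.from_wave_file(wav).play()
            return
```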

Gesture interpreter
The goal of this application is to convert the position data input stream into discrete gestures with a high degree of accuracy. In order to do so, we need unique qualifiers for every gesture. Because most gestures in the music playback example rely on rotation, the qualifying metrics chosen are linear velocity, linear acceleration, angle relative to a centre point, and angular velocity. To create the linear metrics, the first derivative of position is needed for velocity and the second for acceleration. Since the input is a stream of data and not a closed-form function, these derivatives have to be approximated numerically. With the formulas below, the first and second derivatives can be calculated from the position data stream. f(x) represents the data stream of position vectors, in which x is time and is incremented for each incoming sample; 2h is the sample interval over which the derivative is taken. The first and second derivatives are qualifiers for linear constant motion and accelerated motion (swipes). In order to discriminate rotational movement from linear movement, the angular velocity of the remote control is needed. Since rotation is always relative to a static centre point, we first have to define this point. Initially, the centre of the Hall effect sensor array was used for this purpose, but this did not allow for rotational movements in a corner of the sensor array. Therefore, the centre point was set dynamically.

[Figure: software overview diagram. Modules: signal multiplexer, Hall effect integration, gesture interpreter, menu demo, music playback demo, signal interpreter and PID controller; hardware I/O: Hall effect sensors, linear actuator, rotary encoder and DC motor; the legend distinguishes system I/O, embedded firmware and software.]

$$f'(x) \approx \frac{f(x+h) - f(x-h)}{2h}$$

$$f''(x) \approx \frac{f(x+h) - 2f(x) + f(x-h)}{h^{2}}$$
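Applied to the incoming sample stream, these central differences can be computed over a small ring buffer. A minimal sketch, assuming positions arrive as NumPy vectors at a fixed sample interval:

```python
# Central-difference implementation of the formulas above, applied to a
# ring buffer of incoming position samples; h is one sample interval.
from collections import deque
import numpy as np

class Differentiator:
    def __init__(self, h):
        self.h = h                        # seconds between samples
        self.buf = deque(maxlen=3)        # holds f(x-h), f(x), f(x+h)

    def push(self, position):
        """Feed one position sample; return (velocity, acceleration)."""
        self.buf.append(np.asarray(position, dtype=float))
        if len(self.buf) < 3:
            return None, None
        prev, cur, nxt = self.buf
        velocity = (nxt - prev) / (2.0 * self.h)               # f'(x)
        acceleration = (nxt - 2.0 * cur + prev) / self.h ** 2  # f''(x)
        return velocity, acceleration
```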


The position p of the remote control is averaged over N samples; N was set to give an averaging interval of about 4 seconds (see the centre-position formula at the end of this section).

In a fairly straightforward manner, the position of the centre and the current position of the remote control are used to calculate the angle alpha between them. Finally, the same derivatives used for the linear motion are applied to the angular motion to derive angular velocity and acceleration. In a final step, the derived metrics are used to differentiate between gestures by assigning probability values to each gesture, which are raised or lowered based on the values of the derived metrics. When the probability value of a given gesture rises above a threshold, the system assumes this to be the active gesture. For example, if the angular velocity metric is high, the forward-turning (playing) gesture's probability is increased; if this angular velocity is sustained for a longer period of time, the periodic probability increases will cross the threshold and the gesture will become active.
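A sketch of this probability-accumulation scheme is shown below; the particular metrics, decay rate and threshold are assumptions for illustration, not the tuned values of the actual gesture interpreter.

```python
# Sketch of probability accumulation: scores decay unless a metric keeps
# feeding them; a gesture activates once its score crosses the threshold.
# Weights, decay rate and threshold are illustrative assumptions.
THRESHOLD = 1.0
scores = {"play": 0.0, "stop": 0.0, "next": 0.0}

def update(angular_velocity, gyro_speed, dt):
    """Update gesture scores; return the active gesture or None."""
    decay = 0.5 ** dt                           # fade scores over time
    for gesture in scores:
        scores[gesture] *= decay
    if angular_velocity > 0:                    # clockwise rotation
        scores["play"] += angular_velocity * dt
        if angular_velocity > gyro_speed:       # faster than the weight spins
            scores["next"] += (angular_velocity - gyro_speed) * dt
    elif angular_velocity < 0:                  # counter-clockwise: stop
        scores["stop"] += -angular_velocity * dt
    best = max(scores, key=scores.get)
    return best if scores[best] > THRESHOLD else None
```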

Once the active gesture has changed, it is communicated over UDP to the music playback demo application, which acts on the new gesture information.

Music playback demo
The goal of this application is to react to gesture information provided by the gesture interpreter and play music accordingly. It also sends commands to the motor controller in the ball to start and stop rotation.

To strengthen the feedback coupling between the playing of music and the rotation of the weight, both use a ramping effect on their speed. When a song starts playing, it speeds up from 0% to 100% in one second, and in that same time the motor speeds up as well. This ramping effect is reversed when playback is stopped. The commands to the motor controller are given over a serial connection to the embedded processor.
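A minimal sketch of this shared one-second ramp, with hypothetical stand-ins for the audio-player hook and the serial motor command:

```python
# Sketch of the shared one-second ramp. set_playback_rate and
# send_motor_speed are hypothetical stand-ins for the audio player hook
# and the serial command to the embedded processor.
import time

def set_playback_rate(level):                 # hypothetical audio hook
    print(f"playback rate -> {level:.0%}")

def send_motor_speed(level):                  # hypothetical serial command
    print(f"motor speed   -> {level:.0%}")

def ramp(duration=1.0, steps=50, up=True):
    """Ramp audio and motor together from 0% to 100% (or back down)."""
    for i in range(steps + 1):
        level = i / steps if up else 1.0 - i / steps
        set_playback_rate(level)
        send_motor_speed(level)
        time.sleep(duration / steps)

ramp(up=True)    # song starts: both ramp up over one second
```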

PID controller & signal interpreter
This program executes commands from the music playback demo by controlling the motor that drives the weight-shifting feedback. The DC motor is connected to an H-bridge, which allows for movement in both the clockwise and counter-clockwise directions. The motor is also PWM-controllable, which allows for precise setting of the force and speed the motor delivers.

The weight-shifting haptic effect is very delicate and easily disturbed. For example, when the motion of the weight is not held entirely level, there is a perceivable acceleration when the weight moves "downhill" and a deceleration when it moves "uphill". Because a rotary encoder monitors the rotational position of the weight, we can derive the speed of that motion in the same way as done in the gesture interpreter. This speed parameter is used by a PID (proportional-integral-derivative) controller: the basic idea is to adjust the PWM signal to keep the speed constant, compensating for unwanted acceleration of the motor due to external causes.

e(t) represents the error over time (desired speed minus actual speed). The control variable u(t) sets the amount of correction applied to the PWM pulse-pause ratio to compensate for the error. The equation consists of three terms. The first, proportional term reacts to the current error. The second, integral term accounts for past errors, allowing the controller to react to large past errors and compensate for longer. The final, derivative term anticipates future error by looking at the current rate of change of the error, which reduces overshoot. The K coefficients change the effect of each respective term and are set by trial and error to achieve smooth results with the given hardware setup.

Apart from controlling the motion in a smooth way, the controller can also abruptly reverse the current through the motor for a short period of time to make it stop instantly. Using the position feedback the rotary encoder provides, these hard stops can be performed at any angle, creating a semi-linear momentum at that angle.

The running centre position is the average over the last N position samples, where each position is a vector:

$$\vec{p} = \begin{bmatrix} x \\ y \\ z \end{bmatrix}, \qquad \vec{p}_{\text{centre}} = \frac{1}{N}\sum_{i=1}^{N}\vec{p}_i$$

The angle between the centre and the magnet:

$$\alpha = \tan^{-1}\!\left(\frac{p_{\text{magnet}}^{x} - p_{\text{centre}}^{x}}{p_{\text{magnet}}^{y} - p_{\text{centre}}^{y}}\right)$$

The PID control law:

$$u(t) = K_p\,e(t) + K_i\int_0^t e(\tau)\,d\tau + K_d\,\frac{de(t)}{dt}$$
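A discrete-time version of this control law, as a sketch of what might run in the firmware's control loop; the gains are placeholders, set by trial and error in the actual prototype.

```python
# Discrete-time sketch of the PID control law above. Gains are
# placeholders, tuned by trial and error in the actual prototype.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, desired_speed, actual_speed, dt):
        """Return the correction u(t) for the PWM pulse-pause ratio."""
        error = desired_speed - actual_speed
        self.integral += error * dt                   # accumulate past error
        derivative = (error - self.prev_error) / dt   # rate of change of error
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```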


Evaluation

Client evaluation
Since this project was a very close collaboration with B&O, I wanted to take the time to investigate their expert opinion on the outcome of, and the reasoning behind, the project. The only way for me to do this was to plan a second visit to their headquarters in Denmark, to demonstrate the prototype and let people experience it. On my visit, from the 29th until the 31st of May, I gave presentations to different target audiences, ranging from the UX team and marketing to production engineering.

After each presentation, I let people play with the prototype to get a feel for its interactive qualities, and had conversations with them focused on the pros and cons of the two proposed interaction paradigms. Next to that, I performed semi-structured interviews with three UX experts, which I recorded and subsequently transcribed (appendix A contains all important quotes from this transcription). Below, I have clustered some of the quotes from the interviews by subject.

General
Participant one said: "The current state of the project lets us explore some things physically and emotionally instead of as ideas on paper. It helps us to understand the potential of the interaction paradigm and the haptic feelings associated with it." Participant two said: "We were really excited to try it out because from your explanation it is hard to grasp the haptic sensation it provides. On the haptic level, it works really well in providing a convincing experience."

But there was also critique on putting two interaction paradigms into one physical prototype. Participant one said: "You have tried to put many different things into one prototype, which resulted in some constraints, like being confined to this little interaction area. This might have been less of a problem if you had created two or more prototypes that divide the current functionality. But the interesting thing is that these constraints also inspire ways of looking at this that might not have occurred if the prototypes had been separated."

Pragmatic control
I asked the participants whether they experienced the remote control as a pragmatic solution for media control compared to a more traditional remote control. Participants one and two did not agree on this subject. Participant one said: "I think that at this point the limitation of holding it so close to the surface is so great that it is only conceptually pragmatic; the choice of technology binds the human motion a little, to such a degree that at this point you really have to think about this constraint. It is hard to imagine how this would be in real life without the constraint relative to the board." Participant two said: "At this point it falls somewhere in between a pragmatic and a conceptual remote control. I do think you get a good sense of how it works even with the current board constraint, especially with the playback example."

Haptic qualities
Participant three said: "Your application of haptics is unique in the sense that, unlike most applications of haptics, you do not try to replicate existing sensations like the feel of a mechanical switch, but you created a paradigm that stands on its own. This makes it really fun to explore. I do think that your mechanism has some real-world feasibility issues in its current implementation with regards to power draw and the space needed for the mechanism." Later he said: "I think the audio ramping effect that you created, together with the motor starting up, really signals the connection between these modalities well." Participant one said on the same subject: "The fun of the starting and stopping gesture is very good. I'd wonder if you could further build on that."

Embodied interaction paradigm vs. spatial navigation
When discussing the embodied playback demonstration, participant two noted the following on how to know what the starting gesture is: "I was wondering if you could give some kind of feedforward where it would move slightly to signal to the user what gesture to use." I used this feedback to implement an initial rotational movement when the user picks up the ball from the platform.

On the difference between the two interaction paradigms that were demonstrated, participant one noted: "The spatial navigation paradigm is much more shape-agnostic and could just as well be applied to a wand shape, for example." Participant two said: "For me the more embodied approach is much more interesting; it is a much more natural use of the shape of the ball. In the other one, it was really nice to feel the different textures when I browsed the menu items, but I was wondering: why am I holding a ball? It did not resonate with the shape." When I asked why this difference was so prevalent, participant one noted: "I think with the embodied approach you keep the ball as an object and there is an interaction between you and the ball, but in the other example the ball tries to disappear because it is not conceptually necessary to be there. And in that case you can wonder if it is an ideal shape, since you are basically pointing at things."

Practicality
I asked the participants whether they saw practical commercial applications for this work. Participant one said: "I think that well-working pragmatic controls that you have implemented in a unique way could be enough reason to do this; everything does not need to be the same. You have to attract people that are interested in an alternative, maybe more relaxed, control scheme." Participant two was a bit more conservative: "I think that to justify having such a haptic actuator in our remote control we have to come up with additional uses for it. This could be something playful or something meditative, for example." But participant two also said: "It is also a beautiful object and way of interacting to have in your living room, especially compared to a more traditional remote control."

Future development
Participant one said: "I think from this point on it would be interesting to explore both interaction paradigms separately, without the ball as a mandatory form factor. It would be interesting to take your ideas on embodiment and apply them to our pattern play and more music-specific ideas. For the spatial haptic navigation, I would like to see what other form factors and additional feedback modalities could benefit from this." He then elaborated on an IoT application for the device: "I think it could also be very interesting to couple your remote with voice control to make it context-aware and work in an IoT context." Participant three added: "There is something interesting in that the embodied approach feels more like a partnership with technology than a controller, in much the same way that interacting with a voice interface like Siri feels more like a partnership. I think that therefore these two technologies could really extend each other."

Conclusion
Overall, the interview participants and the other B&O employees were very enthusiastic about my concept and appreciated the quality and variety of the haptic sensations I created and the interaction paradigms I constructed with them. The final prototype was perceived as playful, and the playback control as quite intuitive. The stronger action-perception coupling facilitated by the embodied approach was clearly noticed and appreciated by the participants. My suspicion that the spatial mapping would be less effective due to its total lack of feedforward was confirmed by the participants. In discussing how the embodied approach could be brought further, interesting similarities with speech interfaces surfaced, which could possibly serve as a means to make the proposed interaction paradigm more broadly applicable. The lessons learned from this project can be applied in a continuation of the use of spheres as remote controls, or in a broader sense to other haptically enriched projects.


Discussion
Overall, the project has provided interesting insights due to its unusual starting point and chosen constraints. The choice to use a sphere as a means of interaction with media allowed for exploring feedback and feedforward opportunities that would not have been explored in a project without this starting point. Haptic feedback is often merely used as an additional, and therefore non-essential, feedback modality; instead, this project stimulated exploring haptics as a primary feedback modality. Because the user depends on the haptics, the quality and degree of expression needed to be far higher than with haptics as an additional modality. The high level of haptic fidelity achieved in this project is a consequence of its necessity. This lesson can be generalized to other modalities in designing feedback: imagining that a modality is the only communication channel you have towards a user will stimulate using the full bandwidth that modality provides. Combining multiple feedback modalities designed with such a process could give interesting results. Will they add up to a richer multimodal feedback experience, or will they compete for the user's attentional resources?

Feedback & feedforward
Through the focus on haptics, the explorations have also given insights into other feedback modalities. The constraints of haptic feedback showed that, with respect to information bandwidth, not all feedback modalities are created equal. The visual modality, for example, provides a far broader set of possibilities. Especially information with a predictive element to it, indicating that something is about to happen, or information that provides an overview or orientation, is easy to implement visually but nearly impossible with modalities that do not have the same bandwidth.

With respect to feedforward, the concepts demonstrate that it is possible to create haptic feedforward, as in the embodied audio playback application. The embodiment is key in achieving this, because the haptic rotational effect literally communicates to the user the rotational gesture that he can make at that time. In this project, haptics were used proactively to communicate action possibilities, instead of only reacting to user action with haptic feedback.

Application / translation to practice
The evaluation showed the big difference between the strictly reactive haptic feedback approach and the embodied approach, which has both feedback and feedforward. The evaluation also showed that the embodied approach has a much stronger integration between the shape of the remote and the control gestures. By using the same ramping effects on both the audio and the haptic modality, I achieved a much tighter coupling between these modalities, in accordance with the Frogger framework. It also proved to be a fun and exciting experience, which was valued by the client.

I think I managed to convince the UX team at B&O of the benefits of embodiment in interaction and of haptic feedforward as important means for the future of interaction design. In the introduction, I wondered whether a commercially viable remote control could be made given the constraints of this project. The conclusion is that it is definitely possible to create a spherical remote control without physical buttons, but that the functionality you can control with it is inherently limited without making the mapping too cognitive. The haptic qualities designed in this project work best, in my opinion, in combination with other feedback modalities, as can be seen from the audio ramping effect.


Acknowledgements
In my opinion, this report shows that a vast amount of work went into this project, and not only by me; a lot of people have supported me through this process, and without them it would not have been possible.

First off, I want to thank Ditte Hvas Mortensen and Lyle Clarke from Bang & Olufsen for the opportunity of this collaboration and for the extensive feedback I received from them through our weekly Skype meetings and during my two visits to Denmark. My gratitude extends to the rest of the UX team for their valuable feedback during the demos I gave at B&O.

Another big contributor to the project was my coach Joep Frens, who helped me see the bigger picture when so many things were happening at the same time, and who shared his interesting thoughts on rich and embodied interaction.

I wish to thank Frank van Valkenhoef for his help in soldering very tiny, pesky SMD components, for his insightful ideas in electronics design, and for saving my ass by having a spare motor to replace the one I destroyed a couple of days before my final visit to B&O.

I'd also like to thank Jasper Sterk and Chet Bangaru for their help with the 3D printing of parts and the CNC milling of the wooden enclosure, which really made the prototype stand out!

I’d like to thank Bram Naus for keeping me company on the long drives to Denmark and also for being a great discussion partner on the specifics of the project.

I’d also like to thank Ine Mols for the valuable discussions we had on the fundamentals of good interaction design and for proofreading large parts of this report.

And last but not least, I’d like to thank Hannah Keulen for her mental support and the much-needed redrawing of some of my Illustrator work.


References

1: Definition of intuitive. (n.d.). Retrieved January 7, 2016, from www.merriam-webster.com/dictionary/intuitive

2: Gibson, J. J. (1977). The theory of affordances. Hilldale, USA.

3: Hartson, R. (2003). Cognitive, physical, sensory, and functional affordances in interaction design. Behaviour & Information Technology, 22(5), 315-338.

4: Djajadiningrat, T., Overbeeke, K., & Wensveen, S. (2002, June). But how, Donald, tell us how?: On the creation of meaning in interaction design through feedforward and inherent feedback. In Proceedings of the 4th conference on Designing interactive systems: processes, practices, methods, and techniques (pp. 285-291). ACM.

5: Angelini, L., Lalanne, D., Hoven, E. V. D., Khaled, O. A., & Mugellini, E. (2015). Move, Hold and Touch: A Framework for Tangible Gesture Interactive Systems. Machines, 3(3), 173-207.

6: Vaucelle, C., & Ishii, H. (2008, September). Picture this!: Film assembly using toy gestures. In Proceedings of the 10th international conference on Ubiquitous computing (pp. 350-359). ACM.

7: Wensveen, S. A., Djajadiningrat, J. P., & Overbeeke, C. J. (2004, August). Interaction frogger: a design framework to couple action and function through feedback and feedforward. In Proceedings of the 5th conference on Designing interactive systems: processes, practices, methods, and techniques (pp. 177-184). ACM.

8: Follmer, S., Leithinger, D., Olwal, A., Hogge, A., & Ishii, H. (2013, October). inFORM: dynamic physical affordances and constraints through shape and object actuation. In UIST (Vol. 13, pp. 417-426).

9: Srinivasan, M. A., & Basdogan, C. (1997). Haptics in virtual environments: Taxonomy, research status, and challenges. Computers & Graphics, 21(4), 393-404.

10: Rasmussen, M. K., Pedersen, E. W., Petersen, M. G., & Hornbæk, K. (2012, May). Shape-changing interfaces: a review of the design space and open research questions. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 735-744). ACM.

11: Rekimoto, J. (2014, July). Traxion: a tactile interaction device with virtual force sensation. In ACM SIGGRAPH 2014 Emerging Technologies (p. 25). ACM.

12: Ross, P., & Keyson, D. V. (2007). The case of sculpting atmospheres: towards design principles for expressive tangible interaction in control of ambient systems. Personal and Ubiquitous Computing, 11(2), 69-79.

13: McMahan, W., & Kuchenbecker, K. J. (2014, February). Dynamic modeling and control of voice-coil actuators for high-fidelity display of haptic vibrations. In Haptics Symposium (HAPTICS), 2014 IEEE (pp. 115-122). IEEE.

14: Klooster, S., & Overbeeke, C. J. (2005). Designing products as an integral part of choreography of interaction: The product’s form as an integral part of movement. In Proc. of 1st European workshop on Design and Semantics of Form and Movement (pp. 23-35).



Image list

All images in this report are made by me, except for:

1: Glass sphere on page 4. Source: images.google.com/uosd9729

2: Framework illustration on page 4, from Angelini, L., Lalanne, D., Hoven, E. V. D., Khaled, O. A., & Mugellini, E. (2015). Move, Hold and Touch: A Framework for Tangible Gesture Interactive Systems. Machines, 3(3), 173-207.

3: Remote control on page 5 by Bang&Olufsen. Source: http://www.bang-olufsen.com/da

4: inFORM display by MIT Media Lab on page 5. Source: http://tangible.media.mit.edu

5: Carrousel video snapshot on page 9, from the Carrousel demonstration video by Phillip Ross.


Appendix A Interview Quotes

P1: The current state of the project lets us explore some things physically and emotionally instead of as ideas on paper. It helps us to understand the potential of the interaction paradigm and the haptic feelings associated with it.

P1: You have tried to put many different things into one prototype, which resulted in some constraints, like being confined to this little interaction area. This might have been less of a problem if you had created two or more prototypes that divide the current functionality between them. But the interesting thing is that these constraints also inspire ways of looking at this that might not have occurred if the prototypes had been separated.

P1: I think the result is really good; it provides a real hands-on experience of things that otherwise remain just ideas.

The mapping that you’ve tried to build up around rotation as a playback metaphor is very interesting. The idea of using time immorality and momentum (the ramping effect) is especially interesting. It would be interesting to give it to children to play with, to see what they would do: can they, by slowly rotating their hands, slow speech down or something? There is definitely something interesting there which has not been explored yet.

P2: We were really excited to try it out, because from your explanation it is hard to grasp the haptic sensation it provides. On the haptic level, it works really well in providing a convincing experience.

How does it compare with a regular remote control?

P2: At this point it falls somewhere in between a pragmatic and a conceptual remote control.

P1: I think that at this point the limitation of holding it so close to the surface is so great that it is only conceptually pragmatic; the choice of technology binds the human motion a little, to such a degree that you really have to think about this constraint. It is hard to imagine how this would be in real life without the constraint relative to the board.

P2: I do think you get a good sense of how it works, even with the board constraint, especially with the playback example.

Can users find the functionality they are looking for by exploration?

P3: Yes, within a sort of comfortably lost paradigm.

P1: I think they would be able to find the previous and next capabilities. Maybe you could additionally use picking the remote up and putting it back down as user actions that trigger haptic effects which encourage certain behaviour. The fun of the starting and stopping gesture is very good; I wonder if you could build further on that.

P2: I was wondering if you could give some kind of feedforward, where it would move slightly to signal to the user which gesture to use.

P3: Your project has a dynamic comparable to a gyro toy like a Powerball, in that the torque of the ball signals to the user how to further increase the speed of the inner ball.

P2: But that is quite a skilful motion, with a steeper learning curve than your remote control.

P3: Your application of haptics is unique in the sense that, unlike most applications of haptics, you do not try to replicate existing sensations like the feel of a mechanical switch; instead you created a paradigm that stands on its own. This makes it really fun to explore. I do think that your mechanism has some real-world feasibility issues in its current implementation with regard to power draw and the space needed for the mechanism.

P3: I think the audio ramping effect that you created, together with the motor starting up, really signals the connection between these modalities well.

How do you compare the two proposed interaction paradigms?

P2: For me the more embodied approach is much more interesting; it is a much more natural use of the shape of the ball. In the other one, it was really nice to feel the different textures when I browsed the menu items, but I was wondering why I was holding a ball; it did not resonate with the shape.

P1: The spatial navigation paradigm is much more shape-agnostic and could just as well be applied to a wand shape, for example.

P2: I like that in the first example you are really playing with what you can actually do with a ball shape.

Why is the embodied interaction paradigm more aligned with the shape?

P2: I think it takes the shape as a starting point and then says: look what we can do with this, what feels natural. Whereas with the other one, it feels more random that you have that shape.

P1: I think that with the embodied approach you keep the ball as an object and there is an interaction between you and the ball, but in the other example the ball tries to disappear, because it is not conceptually necessary for it to be there. And in that case you can wonder if it is an ideal shape, since you are basically pointing at things.

P2: For me the ball is very big in my hand, so there needs to be a purpose for it; this is clear in the embodied approach but not so much in the other one.

What are the implications of embodiment for the concept of a remote control?

P1: The embodied approach is less of a control and more of a partnership between the technology and you, with this little fun thing in between.

P1: You can be part of the music by controlling it with this ramping effect. This is an interesting, playful effect that you normally just don't get with a remote control. It is not control over a distance; it is a cooperation between an intelligent system and you.

P2: If we take the starting point of this project, there was this minimal element and the health and happiness direction we discussed, and this could really fit into a larger concept that works in that space.

P2: I think that to justify having such an expensive haptic actuator in our remote control we have to come up with additional uses for it. This could be something playful or something meditative.

P1: I think that the well-working pragmatic controls that you've implemented in a unique way could be reason enough to do this. Not everything needs to be the same. You have to attract people who are interested in an alternative, maybe more relaxed, control scheme.

P1: It is just fun, not user-centric in the traditional sense of the word. You have brought a fun proposition to the world and said: let's go from there.

P2: It is also a beautiful object to have in your living room, especially compared to a more traditional remote control.

P1: Your remote control covers all the basics, like our Essence remote, but just does it in an interestingly different way.

General suggestions

P1: I think that from this point on it would be interesting to explore both interaction paradigms separately, without the ball as a mandatory form factor. It would be interesting to take your ideas on embodiment and apply them to our pattern play and more music-specific ideas. For the spatial haptic navigation, I would like to see what other form factors and additional feedback modalities could benefit from this.

P1: So for the playful approach I still think a ball would really work, but try to remove it from the base plate. You could design a nice charging station for it and create a relationship between that and the remote.

P1: It would also be interesting to map your rotational movement to things other than music and see which things put the biggest smiles on people's faces.

P1: Don't forget that the size of the ball is going to have an impact on which control applications are compelling.

P1: I think it could also be very interesting to couple your remote with voice control, to make it context-aware and work in an IoT context.

P1: I think it is very interesting that the device can give you additional information about the music that is playing when you hold it, information that is not available when you lay it down.

P2: I think that the inherently private experience of haptics in a remote control could be an interesting way of sharing a music experience with more people, by passing the remote around.

P3: There is something interesting in that the embodied approach feels more like a partnership with technology than a controller, in much the same way that interacting with a voice interface like Siri feels more like a partnership. I think that these two technologies could therefore really extend each other.
