
Building Interactive Prototypes of Mobile User Interfaces with a Digital Pen

Clemens Holzmann
University of Applied Sciences Upper Austria

Softwarepark 11, 4232 Hagenberg, Austria
[email protected]

Manuela Vogler
University of Applied Sciences Upper Austria

Softwarepark 11, 4232 Hagenberg, Austria
[email protected]

ABSTRACT
Paper prototyping is commonly used to identify usability problems in the early stages of user interface design, but it is not very well suited for the evaluation of mobile interfaces. The reason is that mobile applications are used in a rich real-world context, which is hard to emulate with a paper prototype. A more powerful technique is to test the design on a mobile device, but building a functional design prototype requires much more effort. In this paper, we try to get the best of both worlds by building interactive prototypes with a digital pen. We developed a system which allows for sketching a user interface on paper and manually associating the interface elements with functionality. This enables designers to bring their design ideas to paper without any restrictions, define the meaning of selected interface elements, and test them on a mobile device instantaneously. We conducted a user study in which the participants had to design and test a small application with our system. The results provide evidence for the feasibility and positive aspects of our approach, but also showed some limitations and missing functionalities of its current implementation.

Author Keywords
Mobile interfaces; paper prototyping; pen-based computing

ACM Classification Keywords
H.5.2. Information Interfaces and Presentation: User Interfaces—Prototyping

INTRODUCTION
Paper prototyping is a technique for designing, testing and refining user interfaces, which is commonly used in the early stages of user interface design [12, 18]. A prototype consisting of hand-drawn sketches is presented to the user, who is asked to perform certain tasks with the prototype.

According to Snyder [18], the most important benefits of paper prototyping are that it enables collecting user feedback before the actual implementation is started, that it facilitates rapid iterative development, and that it does not require any technical skills. Paper prototyping can be easily integrated

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
APCHI'12, August 28–31, 2012, Matsue-city, Shimane, Japan.
Copyright 2012 ACM 978-1-4503-1496-1/12/08...$15.00.

into the design process, because most designers prefer sketching in early design stages [10, 12, 14]. The main reasons for the widespread creation of sketches are that they are quick to produce, that the use of paper and pencil is natural, and that the sketches avoid focusing on unimportant details [10].

In the domain of desktop system design, paper prototyping provides solid user feedback and uncovers many possible problems. When comparing the results of a usability evaluation with a paper prototype and a high-fidelity prototype, Virzi et al. [19] found little difference regarding the number of problems discovered with these prototypes.

However, paper prototypes are not well suited for the evaluation of a mobile application design. Many usability problems of mobile applications can best be discovered when evaluating a prototype in realistic settings, making it necessary to take the prototype out of the lab [7]. This can be difficult with a paper prototype due to its fragility; for example, a prototype can be destroyed when put into the user's pocket [6]. Additionally, paper prototypes are often not realistic regarding the size of controls and the amount of information presented on a sketched screen, which can confuse and mislead users during the evaluation [7].

The best way to enable a realistic testing experience is to create a high-fidelity prototype that looks like a final product and that can be tested on the actual mobile device. Lumsden and MacLean observed that using interactive prototypes on mobile devices enables users to identify more usability problems [16]. Additionally, users are more satisfied when using mobile devices, because they get real feedback [9].

Summing up, paper prototypes have the advantages of low development costs and short production time [5], but they are not well suited for the evaluation of mobile applications because of the unrealistic testing experience they provide [7]. In contrast, high-fidelity prototypes improve the testing experience and the quality of the evaluation results [8, 16], but increase the costs and the time needed for creating them [5].

In this paper, we introduce a system which combines the advantages of both low- and high-fidelity prototyping for the evaluation of mobile application designs, with the goal of improving the evaluation results without creating additional costs or effort. First, we present the system's initial version, which generated high-fidelity prototypes from hand-drawn sketches, and present our lessons learnt from this approach. We then discuss how the adaptation of the system's concept towards the building of interactive low-fidelity prototypes helped us to solve these problems, and describe the pen-based generation of these prototypes in detail. We proceed with discussing the results of a user study performed at our university to test the general concept and the system's current implementation. Finally, we survey and compare related work about sketch-based generation of interactive prototypes, and describe future research goals.

TOWARDS THE AUTOMATED GENERATION OF MOBILE APPLICATION PROTOTYPES
As explained in the previous section, there is a dilemma in the early stages of mobile application design. On the one hand, paper prototypes, which are fast and simple to create, do not provide a sufficiently realistic testing experience. However, this is an important factor for identifying many usability problems. On the other hand, high-fidelity prototypes are much more complex to create, and thus slow down the design process.

Our first attempt to solve this dilemma, namely to enable both short iteration cycles and more realistic testing at the same time, was to combine the advantages of low- and high-fidelity prototyping by developing a system that automatically generates high-fidelity prototypes based on paper sketches of the user interface. The goal was to end up with a tool which allows fast and natural designing like in traditional paper prototyping, but also offers the possibility to use high-fidelity prototypes for testing without additional effort. We expected that the use of this system would increase the quality and usability of the resulting mobile applications, and that it would help to reduce the time-to-market and development costs.

In order to enable the automated generation of high-fidelity prototypes, the hand-drawn sketches have to be captured digitally before they can be processed by the system. For this purpose, we decided to use the Anoto digital pen and paper technology [2]. The pen, which is equipped with an embedded infrared camera, takes snapshots of a special pattern printed on the paper, and uses this information to calculate its current position. By recording the position information, it is possible to reconstruct the sketches.

To evaluate our idea, we implemented a first version of the system enabling the automated generation of high-fidelity prototypes based on sketches drawn with an Anoto pen. Figure 1 shows the sketches which define the content of a single screen as well as its resulting representation in the high-fidelity prototype. As can be seen, a special template showing the front side of a mobile phone was printed on the Anoto paper. The template had the same size as the mobile device, with the goal to make both sketching and data processing easier.

During the sketching of a prototype, the system tried to recognize the sketches as soon as they were drawn. For this purpose, an application running on a mobile phone was responsible for receiving the data captured by the Anoto pen via a Bluetooth connection and for forwarding the data to a server that executed the sketch recognition. As the only supported widgets in the first prototype were buttons, every stroke forming a rectangle was recognized as a button. Handwriting within a button was transformed into text and was stored as the button's caption. A line connecting a button and another screen template was interpreted as a definition of screen changing behavior, meaning that a click on the button should lead to the displaying of the other screen.

Figure 1. Two templates containing sketched user interface elements with a defined screen change triggered by the "Back" button (left) and the high-fidelity representation of the sketched screen on the left hand side (right).
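The paper does not spell out the recognition algorithm. As a rough illustration of how a rectangular stroke (and thus a button) could be detected, the following Python sketch, with entirely hypothetical names and thresholds, treats a stroke as a rectangle if it is roughly closed and its path length is close to the perimeter of its bounding box:

```python
def is_rectangle(stroke, closure_tol=0.15, fill_ratio=0.9):
    """Heuristic rectangle test for a single pen stroke.

    A stroke is a list of (x, y) points. It counts as a rectangle if it
    is roughly closed and its path length is close to the perimeter of
    its bounding box. Illustrative only, not the system's recognizer.
    """
    xs = [p[0] for p in stroke]
    ys = [p[1] for p in stroke]
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    if w == 0 or h == 0:
        return False
    # Roughly closed: start and end points must be near each other.
    dx = stroke[0][0] - stroke[-1][0]
    dy = stroke[0][1] - stroke[-1][1]
    diag = (w * w + h * h) ** 0.5
    if (dx * dx + dy * dy) ** 0.5 > closure_tol * diag:
        return False
    # Path length should roughly match the bounding-box perimeter.
    length = sum(
        ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        for (x0, y0), (x1, y1) in zip(stroke, stroke[1:])
    )
    perimeter = 2 * (w + h)
    return fill_ratio <= length / perimeter <= 2 - fill_ratio
```

A closed four-corner stroke passes the test, while an open diagonal line fails the closure check; a real recognizer would of course also have to cope with multi-stroke shapes and noisy pen data.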

As soon as the sketching was finished, the designer could trigger the generation of the corresponding high-fidelity prototype. The system then used the stored sketch data to generate a set of HTML pages, where each page contained the content drawn into one of the sketching templates. During the generation, the sketched button elements were converted into HTML buttons considering the positions and dimensions defined by the sketches. Additionally, the HTML pages were linked to implement the screen changing behavior. The generated prototype could then be tested using the mobile phone's browser.
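A minimal generator along these lines might emit one absolutely positioned HTML button per sketched element and link pages according to the screen-change definitions. The following Python sketch is our own illustration; the data layout and all names are assumptions, not the system's actual format:

```python
def generate_pages(screens):
    """Generate one HTML page per sketched screen.

    `screens` maps a screen name to a list of buttons, each given as
    (caption, x, y, width, height, target_screen). Each button becomes
    an absolutely positioned HTML button wrapped in a link to the
    target screen's page.
    """
    pages = {}
    for name, buttons in screens.items():
        body = "\n".join(
            f'<a href="{target}.html"><button style="position:absolute;'
            f'left:{x}px;top:{y}px;width:{w}px;height:{h}px">'
            f"{caption}</button></a>"
            for caption, x, y, w, h, target in buttons
        )
        pages[f"{name}.html"] = f"<html><body>{body}</body></html>"
    return pages

# Two screens linked to each other, as in the paper's "Back" example.
pages = generate_pages({
    "main": [("Settings", 20, 40, 120, 48, "settings")],
    "settings": [("Back", 20, 40, 120, 48, "main")],
})
```

Writing the resulting dictionary to files would yield a browsable prototype of the kind described above.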

During the implementation and testing of the system, which was still in a quite premature stage, we came across several problems and shortcomings of our approach:

• There was no feedback for the designer during the sketching process, so that the designer had to rely upon the correct recognition of all sketches.

• The system was highly dependent on the correct recognition of sketch types and handwritten texts, because there was no mechanism for correcting wrong recognition results.

• The system supported the definition of buttons, but we were not sure how to sketch and process more complex and dynamic user interface elements like scrollbars or combo boxes.

These problems encouraged us to revise the system's basic concept in order to develop a tool that is easy and comfortable to use for designers, works reliably and supports the definition of a wide range of commonly used user interface elements. The following section describes our new prototyping approach, which has been developed based on the lessons learnt from the first attempt.

PEN-BASED GENERATION OF INTERACTIVE LOW-FIDELITY PROTOTYPES
Most of the problems listed in the previous section are related to the need for shape and handwriting recognition, which was essential for transforming the sketches into interactive user interfaces. Therefore, we assumed that a change in concept reducing the need for sketch recognition would help to solve most of these problems.

Adapted Concept
There are several other projects generating digital prototypes from sketches which do not require sketch recognition (e.g. Paper-in-Screen [3] and ActiveStory [20]). The resulting prototypes look like the original sketches, but allow at least simple interaction during the prototype evaluation. The sketchy look of these prototypes makes them well-suited for the early design stages, because low-fidelity prototypes force users to concentrate on general concepts like layout, terminology and navigation [17]. In contrast, test participants often focus on details like fonts or colors when evaluating a high-fidelity prototype, but this kind of feedback is not that relevant in early design stages. Because the generated prototypes are interactive and can be tested on the targeted devices, they enable a more realistic usage experience than simple paper prototypes, which is important for the evaluation of a mobile application design [7].

The concept of interactive low-fidelity prototypes seems to be applicable to the system described in this paper, which is mainly targeted at the early phases of mobile application design. Besides combining the gathering of more valuable feedback due to the sketchy look of the prototypes and the enabling of interactive prototype testing on the target devices, this new approach also reduces the need for shape and handwriting recognition, which was one of our most important goals.

In the system's previous version, all the sketch data had to be sent to a server for recognition, as the mobile platform did not provide appropriate recognition libraries. The reduced need for sketch recognition makes it possible to process the sketches directly on the mobile device as soon as they are drawn. As a result, the displaying of immediate feedback for the designer during the sketching becomes possible without having to cope with potentially slow data transmission and latency.

Another major conceptual change affected the technology used for building the interactive prototypes. In the system's first version, the prototypes consisted of a set of HTML pages that could be accessed using the mobile phone's browser. However, this approach did not provide the most realistic testing experience, because the look and feel of websites is not the same as that of a real application. Additionally, the browser's behavior (e.g. appearing browser controls like the URL input field or the screen rotation behavior) could confuse the user. In order to make the handling of the prototype more realistic, for example to allow for its start from the phone's menu, the system was adapted to generate installable and executable applications representing the sketched user interfaces.

Supported Widgets
For the implementation of our adapted concept for generating low-fidelity prototypes, we decided to use the Android platform. The main reasons for this decision were its huge market share on the one hand, and the simple Bluetooth connection with the Anoto pen on the other hand. We analyzed the widgets provided by Android (see [1] for a list of all widgets) in order to define which ones can be supported by our new system. However, please note that our approach is not limited to a certain platform, as other mobile operating systems provide similar sets of user interface elements. In the following list, a classification of widgets is given, which provides the basis for the implementation of interaction in our system:

• Buttons: Clickable elements that trigger screen changes (e.g. Button, ImageButton).

• Two-states buttons: Buttons that can either be checked or unchecked and use an icon to visualize their current state (e.g. CheckBox, RadioButton).

• Popup elements: Elements that trigger the displaying of a content popup after being clicked (e.g. Spinner, which is the Android equivalent to a combo box).

• Elements with simple predefined behavior: Elements with simple type-dependent behavior that do not support the definition of further behavior (e.g. EditText, RadioButtonGroup, ScrollView).

• Elements with complex predefined behavior: Elements with more complex appearance and predefined behavior (e.g. AnalogClock, DigitalClock, DatePicker, TimePicker).

• Non-interactive elements: Elements that just display information, but do not enable interaction (e.g. TextView, ImageView).

• Elements without user-controlled behavior: Widgets with behavior that can only be controlled by the application logic, but not by direct user interaction (e.g. Toast, ProgressBar).
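One plausible way to carry this classification in code is a simple lookup table from widget names to the classes above. The class labels are the paper's; the mapping itself is our own sketch:

```python
# Illustrative mapping of Android widget names to the paper's
# interaction classes. The class names follow the list above; the
# dictionary itself is an assumption, not the system's actual data.
WIDGET_CLASSES = {
    "Button": "button",
    "ImageButton": "button",
    "CheckBox": "two-states button",
    "RadioButton": "two-states button",
    "Spinner": "popup element",
    "EditText": "simple predefined behavior",
    "RadioButtonGroup": "simple predefined behavior",
    "ScrollView": "simple predefined behavior",
    "AnalogClock": "complex predefined behavior",
    "DigitalClock": "complex predefined behavior",
    "DatePicker": "complex predefined behavior",
    "TimePicker": "complex predefined behavior",
    "TextView": "non-interactive",
    "ImageView": "non-interactive",
    "Toast": "no user-controlled behavior",
    "ProgressBar": "no user-controlled behavior",
}

def interaction_class(widget):
    """Return the interaction class for a widget name."""
    return WIDGET_CLASSES.get(widget, "unknown")
```

Such a table lets the sketch-processing code dispatch on the interaction class rather than on individual widget types.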

Defining Prototype Behavior
The low-fidelity prototypes generated by the system introduced in this paper enable basic interaction like screen changes, the use of combo boxes and the selection of options using radio buttons or checkboxes. In order to make a sketched widget interactive, the designer has to manually define its type and the associated behavior. Several interaction concepts, which have been developed for this purpose, are shown in table 1. Elements without user-controlled behavior are not supported in the current version of our system, and remain an open issue for future work.

For widgets that should not be interactive, none of the listed interaction concepts has to be used during the sketching of a prototype. The designer just has to write or draw the non-interactive content into a screen template, and the sketches are later displayed in the generated prototype without allowing for any kind of interaction.

All interactive widgets require at least one additional step after sketching what they look like. For most of them, it is necessary to assign type information using a concept similar to that of ActiveStory [20], namely to tag areas of a prototype screen with type information. The left picture in figure 2 shows the two steps that are necessary for defining a widget type using the toolbox. First, the type has to be selected by touching one of the toolbox elements printed on the paper with the Anoto pen. The user can then define the area to tag with this type information by drawing its bounding box with a single stroke. After the stroke has been finished, the mobile phone, which is responsible for processing the sketches, immediately displays feedback by changing the color of all included parts of the sketch (there are different colors for different widget types). In the resulting prototype, the widget looks exactly like the sketched element.

Type                                      | Toolbox assignment | Snippet use | Screen linking | Popup sketching | Snippet embedding | Icon definition
Button                                    | x                  |             | x              |                 |                   |
Two-states button                         | x                  |             |                |                 |                   | x
Popup elements                            | x                  |             |                | x               | x                 |
Elements with simple predefined behavior  | x                  |             |                |                 |                   |
Elements with complex predefined behavior |                    | x           |                |                 | x                 |
Non-interactive widgets                   |                    |             |                |                 |                   |
Table 1. Overview of used interaction concepts
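To color the tagged parts of a sketch, the system has to decide which strokes fall inside the bounding box the designer drew. A simplified containment test, with hypothetical names, could look like this:

```python
def strokes_in_box(strokes, box):
    """Return the strokes that lie entirely inside a tagged box.

    `box` is (left, top, right, bottom); a stroke is a list of (x, y)
    points. A simplified sketch of the containment test needed to
    highlight the parts of the sketch covered by a type tag.
    """
    left, top, right, bottom = box
    return [
        s for s in strokes
        if all(left <= x <= right and top <= y <= bottom for x, y in s)
    ]
```

A production version would likely also accept strokes that are mostly, rather than entirely, inside the box, to tolerate imprecise pen input.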

Figure 2. Defining a widget's type with the Anoto pen (left) results in immediate feedback (middle). In the generated prototype, the widget keeps its original look (right).

Similar approaches for adding metadata to content written or drawn with an Anoto pen have been used in several other projects. NiCEBook [4], for example, a paper notebook enabling the digital capturing and organization of notes written with an Anoto pen, allows assigning a category to a note by first ticking the corresponding checkbox and then defining the note's area by selecting two corners of the region. PapierCraft [13] uses special gestures to tag content with keywords and select areas that should be sent via e-mail.

For most of the widget types, the designer can define specific behavior after having assigned the type information. For example, it is possible to define a screen change triggered by a button click using the screen linking concept shown in figure 3. After touching the triggering button area with the Anoto pen, the user has to touch the template containing the target screen. Again, the mobile client application provides immediate feedback: the left screen contains the triggering button, and the right one is the resulting screen. If the designer wants to edit the screen change definition, this can be done by first touching the triggering button element again and by touching another screen template afterwards. Additionally, it is also possible to delete a screen change by repeating the original definition steps.
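The touch-button-then-touch-template sequence, including redefinition and deletion by repetition, can be sketched as a small state machine. The class and method names below are our own invention, not the system's API:

```python
class ScreenLinker:
    """Minimal sketch of the screen-linking interaction.

    Touching a button and then a screen template defines a screen
    change; touching a different template afterwards redefines it,
    and repeating the exact same definition deletes it.
    """

    def __init__(self):
        self.links = {}        # button id -> target screen id
        self._pending = None   # button waiting for a target touch

    def touch_button(self, button):
        self._pending = button

    def touch_template(self, screen):
        if self._pending is None:
            return
        button, self._pending = self._pending, None
        if self.links.get(button) == screen:
            del self.links[button]       # repeated definition deletes
        else:
            self.links[button] = screen  # define or redefine
```

Keeping the link table in one place like this also makes it easy to emit the immediate visual feedback after each template touch.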

Figure 3. The definition of a screen change triggered by a button click (top) is also visualized by the mobile client application (bottom).

Popup elements like combo boxes require sketching the content of the popup on a special template that is also printed on Anoto paper. The size of this snippet can be adapted by cutting off unneeded space or by folding it, which allows making the popup just as big as necessary. To link the spinner element with its content popup, the designer first has to touch the element with the Anoto pen, followed by the steps of the snippet embedding concept described below. In the generated prototype, a popup overlay shows the content after the spinner element is clicked.

To embed a snippet (i.e. a separate piece of paper) into a screen template, the snippet has to be put onto the screen template first. As shown in figure 4, the designer has to draw two lines afterwards, which connect the corners of the snippet with the screen template. These two lines enable the determination of both the snippet's position and its size. Again, it is possible to remove an embedded snippet by repeating these steps. This interaction concept is similar to that used in NiCEBook [4] for the tagging of an entire page of the notebook with a specific topic through a dog-ear, which has to be registered by stroking over the folded corner.
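Assuming each connector line has one endpoint on the snippet page and one on the screen template, and that the snippet-side endpoints mark opposite corners of the snippet, the placement could be derived as in the following sketch (a simplified geometric interpretation on our part, not the system's actual code):

```python
def embed_snippet(line1, line2):
    """Derive a snippet's placement from the two connector lines.

    Each line is ((snippet_x, snippet_y), (screen_x, screen_y)): its
    endpoints on the snippet page and on the screen template. The
    snippet-side endpoints are assumed to be opposite corners of the
    snippet, so the two lines fix both its size and the position of
    its top-left corner on the screen.
    """
    (sx1, sy1), (tx1, ty1) = line1
    (sx2, sy2), (tx2, ty2) = line2
    size = (abs(sx2 - sx1), abs(sy2 - sy1))
    position = (min(tx1, tx2), min(ty1, ty2))
    return position, size
```

Because the Anoto pattern identifies the page each coordinate belongs to, the system can tell the snippet-side endpoints from the template-side ones.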

Figure 4. After embedding the popup snippet into the screen by drawing two lines (left), it is also visible in the mobile client application (middle). In the generated prototype, the popup is only displayed after clicking the spinner element (right).

The snippet embedding concept is not only used for popup elements, but also for the integration of widgets with complex type-dependent behavior and appearance like clocks or date and time pickers. In this case, the snippet consists of a sketchy-looking print of the corresponding widget on Anoto paper, which allows the designer to integrate complex interactive widgets without having to sketch them. The type of the widget is implicitly given by choosing the right snippet and does not have to be assigned using the toolbar. In figure 5, a date picker snippet and its representation in the generated prototype are shown.

Figure 5. A date picker snippet (left) and its representation in the generated prototype (right).

The last concept for defining behavior, the icon definition, is only used for two-states buttons (radio buttons and checkboxes) which consist of an icon indicating their current state. In the generated prototypes, the icons are supposed to change in order to visualize the current state (e.g. an x marks a checked checkbox). This makes it necessary for the system to know which part of the sketched element represents the icon.

For this purpose, the stroke that was first drawn within the widget area is assumed to be the icon after defining a two-states button using the toolbox, because most people might start to sketch a two-states button by drawing its icon. The stroke recognized as the icon is highlighted by the mobile client application using a thicker stroke. If the designer is not satisfied with this selection, it is possible to change it by first touching the two-states button area and by then drawing the icon with a single stroke afterwards.
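Under this heuristic, picking the icon amounts to finding the first stroke, in drawing order, that lies entirely within the widget's area. A minimal sketch with hypothetical names:

```python
def pick_icon_stroke(strokes, widget_box):
    """Pick the presumed icon stroke of a two-states button.

    Following the heuristic described above, the first stroke (in
    drawing order) that lies entirely inside the widget's area is
    assumed to be the state icon. `widget_box` is
    (left, top, right, bottom); a stroke is a list of (x, y) points.
    """
    left, top, right, bottom = widget_box
    for stroke in strokes:
        if all(left <= x <= right and top <= y <= bottom
               for x, y in stroke):
            return stroke
    return None  # no stroke inside the widget area
```

The manual override described above would then simply replace the returned stroke with one drawn explicitly by the designer.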

EVALUATION AND DISCUSSION
Although the prototyping system introduced in this paper is still in an early implementation stage, we performed a first user study using a version of the system that only supported a limited set of the previously listed interactive widgets and interaction concepts. The study focused on the general feasibility as well as the usability of our adapted concept. However, the intention was not to provide information about the usefulness of the system compared to traditional paper prototyping, which will be investigated in a later development stage. The objectives of the study were

1. to check if the participants like the system's general concept and feel comfortable when using the system,

2. to observe how the participants use the system for designing simple prototypes (e.g. design flow, use of interactive widgets, ...),

3. to verify that the used interaction concepts are intuitive and easy to learn,

4. to identify problems arising when using the system for a real design task, and

5. to find out which user interface elements besides the already implemented ones are considered to be important when designing a prototype.

Procedure
As it was more important at this stage to eliminate simple usability issues before we evaluated a fuller system version with professional designers, we recruited 12 volunteers, 9 males and 3 females, at the university campus. All of them were computer science students (10 undergraduate and 2 graduate students), and they were between 20 and 26 years of age (M=22.5, SD=2.0). While all of them had developed mobile applications before, only 6 of them had experience in user interface design and 4 participants in prototyping (one of them in paper prototyping). Only 4 of the participants had used the Anoto technology before.

For each of the participants, the evaluation session took approximately 30 minutes. After welcoming the participants, they were informed about the purpose of the study as well as the basic idea of our prototyping system. They were then given a brief demonstration of how to use the system. This demonstration included the handling of the mobile client application and the Anoto pen, the sketching of user interfaces, the definition of widget types and associated behavior, as well as the editing of previously defined information. In every evaluation session, the prototype which was created for demonstration purposes was basically the same, and it contained the two interactive widget types supported by the system's version used for the evaluation: buttons and spinner elements. At the end of the demonstration, the generated prototype was shown to the participants to complete the overview of the system's features.

After the demonstration, each participant received a sheet of paper listing several requirements of a mobile task management application that should be designed using our system. The requirements specified only a very simple application and were kept rough in order to allow for a creative design process. Basically, there were only two features the prototype had to contain: (i) the display of an overview of all tasks and (ii) the possibility to add a new task. It was also defined that it should be possible to specify at least a description, priority, due date and category for every task entry. Additionally, participants were told that the prototype should consist of 2 to 4 screens.

Afterwards, the participants started to design the user interface of the specified task management application. They were told to talk about their thoughts during the sketching as well as to ask questions. One researcher took notes about how the participants performed the tasks, the questions they asked and the problems that occurred.

After finishing the task, users were asked to give feedback about the system and to name ideas for its extension and improvement, with a focus on other interactive user interface elements which should be supported in the future. Finally, they were asked to complete the Post-Study System Usability Questionnaire (PSSUQ) [11].

Results and Timing

All but one participant were able to create a prototype meeting the specified requirements. This participant created just an overview screen containing a single task entry which leads to a detail page when clicked, but did not design a possibility to add new tasks.

Figure 6 shows how much time it took the participants to complete their designs. Completion times ranged from 143 seconds for the fastest to 1016 seconds for the slowest participant, with a median of 650 seconds. The fastest participant was also the one who did not meet the requirements.

Figure 6. The distribution of time needed for creating the prototype (in seconds).

7 participants used 2 screens for their prototype; their required time was between 143 and 927 seconds with a median of 450 seconds. 5 participants used 3 screens and needed between 519 and 1016 seconds with a median of 679 seconds. None of the participants used four screens.

Figure 7. The time needed for creating the prototype (in seconds), depending on the number of used screens (2 or 3).

User Feedback

After finishing the creation of their prototype, the participants were asked to give feedback in an interview, focusing on ideas for improving and extending the system. Generally, the participants recommended that the system should support all the common user interface elements that are also supported by typical UI builders. Other ideas were to enable the selection of different colors for the presentation of the sketches in the generated prototypes and to allow the definition of properties for some widget types (e.g. the font size for a text field). One participant suggested supporting different screen types, such as screens containing a list or a gallery, and letting the designer define the type of a screen by using corresponding sketching templates.

As mentioned previously, one of the objectives of the user study was to find out which other interactive user interface elements are considered important by the test participants and should therefore be supported by our system. The following widget types were named:

• Text fields (for text input)

• Checkboxes, radio buttons

• Images

• Date picker, calendar

• Lists with clickable elements

• Tab views

• Scroll areas

After the interview, the participants were asked to complete the Post-Study System Usability Questionnaire (PSSUQ) [11]. The questionnaire uses a 7-point Likert scale from strongly disagree (1) to strongly agree (7). Figure 8 shows a summary of the given answers.

Q1: Overall, I am satisfied with how easy it is to use this system (M = 6.33, SD = 0.62)

Q2: It was simple to use this system (M = 6.25, SD = 0.60)

Q3: I could effectively complete the tasks and scenarios using this system (M = 6.42, SD = 0.60)

Q4: I was able to complete the tasks and scenarios quickly using this system (M = 6.27, SD = 1.29)

Q5: I was able to efficiently complete the tasks and scenarios using this system (M = 6.09, SD = 0.79)

Q6: I felt comfortable using this system (M = 6.58, SD = 0.86)

Q7: It was easy to learn to use this system (M = 6.92, SD = 0.28)

Q8: I believe I could become productive quickly using this system (M = 6.75, SD = 0.43)

Q9: The system gave error messages that clearly told me how to fix problems (M = 4.00, SD = 2.09)

Q10: Whenever I made a mistake using the system, I could recover easily and quickly (M = 6.00, SD = 1.04)

Q11: The information (such as on-line help, on-screen messages, and other documentation) provided with this system was clear (M = 5.50, SD = 1.50)

Q12: It was easy to find the information I needed (M = 5.60, SD = 1.43)

Q13: The information provided for the system was easy to understand (M = 6.55, SD = 0.66)

Q14: The information was effective in helping me complete the tasks and scenarios (M = 6.55, SD = 0.78)

Q15: The organization of information on the system screens was clear (M = 6.17, SD = 1.34)

Q16: The interface of this system was pleasant (M = 5.92, SD = 1.26)

Q17: I liked using the interface of this system (M = 6.42, SD = 0.95)

Q18: This system has all the functions and capabilities I expect it to have (M = 4.25, SD = 1.48)

Q19: Overall, I am satisfied with this system (M = 6.08, SD = 0.64)

Figure 8. Average user answers to the questions of the PSSUQ questionnaire (1 = strongly disagree, 4 = neither agree nor disagree, 7 = strongly agree).
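The per-item means and standard deviations reported in Figure 8 can be computed directly from the raw Likert responses; the following is a minimal sketch in which the response values are invented for illustration and do not reproduce the study's data:

```python
from statistics import mean, stdev

# Hypothetical per-participant Likert responses (1-7) for two PSSUQ items;
# the numbers are invented, not the raw data collected in the study.
responses = {
    "Q1": [6, 7, 6, 6, 7, 6, 7, 6, 6, 7, 6, 6],
    "Q9": [1, 7, 4, 2, 6, 4, 3, 5, 6, 2, 4, 4],
}

def summarize(items):
    """Per-item mean and sample standard deviation, as reported in Figure 8."""
    return {q: (round(mean(v), 2), round(stdev(v), 2)) for q, v in items.items()}

for q, (m, sd) in summarize(responses).items():
    print(f"{q}: M = {m:.2f}, SD = {sd:.2f}")
```

Whether the paper reports the sample or the population standard deviation is not stated; the sketch uses the sample form.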

The questionnaire helped us to get the information and feedback we wanted to gather from the study. First of all, it showed that the participants liked to use the system and felt comfortable while using it. In general, the participants were satisfied with the system and also with how easy it was to use, and they felt able to effectively and quickly complete their task using the prototyping system. Most participants agreed to Q6 ("I felt comfortable using this system"), just one neither agreed nor disagreed. 10 participants found the interface of the system pleasant (Q16), and 11 participants liked using it (Q17), while one neither agreed nor disagreed.

The questionnaire also verified that the use of the system is easy to learn and that the supported interaction concepts are intuitive. All participants agreed that it was easy to learn to use the system (Q7) and that it was simple to use (Q2). The participants also strongly agreed that they could become productive quickly using our system (Q8).

Some aspects of the system, especially the handling of errors, were not rated that well. On average, the participants neither agreed nor disagreed to Q9 ("The system gave error messages that clearly told me how to fix problems"), and some of them also stated that they did not feel able to recover easily and quickly after a mistake (Q10). We also noticed this during the sketching of the prototypes, as we often had to intervene and tell the participants how to proceed in cases of problems and mistakes. It is therefore essential to improve the error handling capabilities in future system versions.

Another aspect criticized by the participants was the range of provided functions and capabilities. The answers to Q18 ("This system has all the functions and capabilities I expect it to have") emphasize the need for supporting more interactive user interface elements than those listed before on the one hand, and for extending the system's feature set on the other hand. Only 5 users agreed to this question, while 3 neither agreed nor disagreed and 4 disagreed.

Problems Encountered

One of the objectives of our study was to identify problems occurring when performing a realistic task with the prototype system. Some of the observed problems were related to the inaccurate printing of the sketching templates. Unfortunately, we did not notice that the printer used for printing the Anoto pattern did not work accurately enough due to its paper feed, so that the Anoto pattern was positioned lopsidedly on several pages. When printing the sketching templates on these pages, this resulted in differences between a template's position on the Anoto pattern and the position defined in the system. As the positioning offset was not the same for all templates, it was not possible to correct it by simply redefining the position in the system. In the future, it will be necessary to pay more attention to the exact printing of the templates to avoid the offset and the resulting problems listed below:

• There was a distracting offset between the representation of the sketches on the paper and in the feedback displayed by the mobile client application, which confused some of the test participants.

• Participants were not able to define the content popups for spinner elements, because the system was not able to correctly handle the snippet embedding interaction concept. Because of the positioning offset, the system could not recognize when the Anoto pen was moved into the screen template during the drawing of the two lines connecting the snippet and the underlying screen, and therefore cancelled the processing of the captured data.

Other problems that occurred during the sketching were caused by the system's sketch processing capabilities:

• The tested version of the system regularly crashed when users sketched fast. Although we asked the study participants to sketch slowly, 5 of them were not able to reduce their sketching speed and therefore caused system crashes. As users should not have to change their drawing style and speed in order to use our system, we definitely have to improve the implementation to make it robust during faster sketching as well.

• Due to the strictly defined position and size of the sketching template, there were problems when participants sketched close to the borders of the template. Although the tip of the Anoto pen was still inside the defined drawing area, the camera already captured positions outside the area due to a small offset between tip and camera. To solve this, the system's flexibility regarding sketches outside the drawing area has to be increased, e.g. by making the defined area a little bigger than it is.
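The enlargement of the drawing area suggested above amounts to a point-in-rectangle test with a tolerance margin. A minimal sketch follows; all names and the margin value are assumptions for illustration, not the system's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class DrawingArea:
    x: float
    y: float
    width: float
    height: float
    margin: float = 2.0  # assumed tolerance in pattern coordinate units

    def contains(self, px: float, py: float) -> bool:
        # Accept pen positions slightly outside the strict template area to
        # compensate for the offset between the pen's tip and its camera.
        return (self.x - self.margin <= px <= self.x + self.width + self.margin
                and self.y - self.margin <= py <= self.y + self.height + self.margin)

area = DrawingArea(x=0, y=0, width=100, height=150)
print(area.contains(101.5, 75))  # just outside the strict area, but accepted
```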

Some of the test participants were interrupted while sketching their prototype because of technical problems, for example:

• Crashes of the wireless network that made it impossible to transmit the captured sketches to the server

• Crashes of the Bluetooth connection between Anoto pen and mobile client application

• Communication problems between Anoto pen and mobile client application which made it impossible for the system to capture the sketches

For the further implementation of the system, it will be important to improve the monitoring of the connections to the server and the pen, so that users can be better informed about occurring communication problems and told how to proceed.

Discussion

The early user study described in this section gave us a lot of information and feedback that will be of value for the further improvement of our system. During the sketching sessions, we could observe that users really enjoyed using the tool and that they were impressed when testing the generated interactive prototypes. This observation is also confirmed by the results of the post-study questionnaire, where the participants stated that they were satisfied with the system and liked using its interface.

The participants quickly learned how to use the tool and were able to create sophisticated interactive prototypes despite the limited range of supported widget types. Some of the participants got really creative in using the supported user interface elements to build more complex elements; for example, they created task lists containing clickable elements by defining the list entries as buttons. By assigning screen change information to these elements, they were able to implement a detail screen for selected tasks. One of the participants also tried to create a date picker control using a spinner element and a linked popup.
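The screen-change behavior the participants exploited can be illustrated with a minimal data model; all names below are invented for illustration and do not reflect the system's internal implementation:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Widget:
    kind: str                            # e.g. "button" or "spinner"
    target_screen: Optional[str] = None  # screen shown when the widget is tapped

@dataclass
class Screen:
    name: str
    widgets: List[Widget] = field(default_factory=list)

# A task-list entry drawn as a button that links to a detail screen.
overview = Screen("overview", [Widget("button", target_screen="task_detail")])

def on_tap(screen: Screen, index: int) -> Optional[str]:
    """Resolve a tap on the index-th widget to the next screen, if any."""
    return screen.widgets[index].target_screen

print(on_tap(overview, 0))  # -> task_detail
```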

When observing how the participants used the prototyping tool for their design task, we identified several problems regarding the currently used interaction concepts:

• The drawing of a bounding box for defining the area of an interactive widget might not be optimal, because it requires adding a stroke that does not belong to the actual design. This makes the resulting screen sketches messy, especially in cases where the type definition fails and has to be repeated. Therefore, it might be necessary to find another concept for defining the area to tag with type information, for example by just drawing the top left and the bottom right corner of the bounding box, as proposed in [4] for assigning categories to notes. One of the participants also suggested using the plastic pen tip instead of the ink cartridge for defining widget types.

• The interaction concept for embedding a snippet into a screen, used for defining the popup of a spinner element, appeared to be not as intuitive as the screen linking concept. We observed that several participants tried to apply the screen linking approach to the popup definition, so that they simply clicked the popup and the containing screen instead of drawing the necessary connection lines. One participant had the idea to make the embedding of snippets easier by letting the user set the position and size of the popup within the screen template by defining its bounding box (e.g. by drawing two corners of the bounding box), and then to link the area and the popup by clicking the popup and the defined area, which is similar to the concept for defining screen changes.
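The two-corner idea that appears in both suggestions above can be sketched as follows; the function name and tuple layout are our own, not part of the system:

```python
def bounding_box(corner_a, corner_b):
    """Return (x, y, width, height) from two diagonal corner points,
    independent of the order in which they were drawn."""
    (x1, y1), (x2, y2) = corner_a, corner_b
    # Normalize so the box is valid even if the bottom-right mark came first.
    return (min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1))

print(bounding_box((120, 40), (20, 90)))  # -> (20, 40, 100, 50)
```

Two isolated pen marks would replace the four strokes of a full rectangle, keeping the design sketch cleaner.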

The study revealed two main aspects of the system that have to be improved to make the users more satisfied: (i) the set of supported widget types and (ii) the error handling. Although some participants were able to use the supported widgets for building more complex elements, they still stated that other common user interface elements like checkboxes and text fields are also important for their designs. The second point, the currently inadequate error handling, seemed to affect user satisfaction more negatively than the limited set of features.

RELATED WORK

Various research projects deal with the usage of sketches as a basis for the generation of interactive prototypes. Table 2 provides an overview of the projects that are subsequently explained in more detail.

Some of these tools create sketchy-looking interactive prototypes like our system. A simple way of achieving this is to digitize paper prototypes, e.g. by taking photos. Paper-in-Screen [3] uses this concept to display application prototypes on mobile devices, but has the disadvantage that the user is just able to flip through the images. The pseudo-paper prototypes proposed by Lumsden and MacLean [16] use a similar approach, but support the definition of clickable areas enabling basic interaction like button clicks.



Tool                  Fidelity            Interactivity
Paper-in-Screen [3]   Low                 Flip through images
Pseudo-paper [16]     Low                 Clickable areas
ActiveStory [20]      Low                 Clickable areas
De Sa et al. [9]      Low, mid and high   Fully interactive
SILK [10]             High                Fully interactive
DENIM [15]            Low                 Fully interactive

Table 2. Overview of sketch-based prototyping tools

ActiveStory's [20] concept is similar to that of our system. It not only supports input in the form of images, but also allows for the pen-based generation of interactive low-fidelity prototypes as well as the definition of clickable areas. In contrast to our system, these clickable areas cannot be tagged with type information, but can only be used to trigger screen changes. ActiveStory also differs from our system in that it collects data during the evaluation of a prototype, for example mouse trails, page durations and comments entered by the participants.

De Sa et al. [9] developed a software framework allowing the creation of low-, mid- and high-fidelity prototypes that can be evaluated on mobile devices. Scanned images can be used for the low-fidelity prototypes and high-fidelity prototypes can be built by selecting pre-configured user interface elements, but the system does not allow for pen-based input. Another difference to the concept of our system is that the framework encourages test users to actively participate in the design process, because they can edit the prototype during the evaluation. For example, it is possible to change the location and size of elements, to delete screens and components, and to rearrange the screen sequence.

In contrast to our system, SILK [10] uses shape recognition to convert a sketched prototype into a functional interface that can be reused in later stages of the design cycle. To avoid the problems caused by sketch recognition that our system had to cope with, SILK uses a trainable recognizer, provides feedback about the recognition results and allows the correction of recognition errors.

DENIM, a tool for the sketch-based design of web sites using digitizing tablets, enables designers to test designs in their original sketchy look [15]. The implemented visual language allows the definition and usage of components for reusable user interface elements. In contrast to our system's widget type concept, DENIM distinguishes between intrinsic components built into the visual language, like text fields and buttons, and custom components that can be freely designed. The prototype behavior that can be defined with the visual language goes beyond that of our system, because it supports different events for triggering screen transitions (e.g. left mouse clicks, double clicks and timeouts) and the definition of conditional behavior.

CONCLUSIONS AND FUTURE WORK

Realistic prototypes, which can be tested on the targeted device, are considered to improve the quality of prototyping results in the domain of mobile application development and should thus be preferred over simple paper prototypes. The problem with using interactive prototypes is that the effort to create them is much higher than the effort for creating low-fidelity prototypes, which can quickly be sketched on paper. This makes interactive prototyping inadequate for the early phases of the design process, where short iteration cycles are crucial.

This paper presents our research aiming at combining the advantages of paper prototypes and interactive prototypes by using an Anoto pen for sketching a user interface and by automatically generating interactive prototypes based on these sketches. It describes the first version of the system responsible for the processing of the sketched prototypes, which generated HTML-based high-fidelity prototypes. Due to several problems and limitations mainly caused by the need for reliable shape and handwriting recognition, it was necessary to change the system's concept towards the generation of interactive low-fidelity prototypes.

Our adapted concept of generating interactive low-fidelity prototypes instead of high-fidelity ones is supported by the work of other researchers, who found that sketchy-looking prototypes are better suited for the evaluation of a user interface in the early design stages. The reason is that they enable test participants to focus on general concepts like content and navigation instead of details like colors. Although the prototypes look rough, they can be tested on mobile devices to provide a realistic testing experience.

A first user study showed that novice users are able to quickly learn how to use our prototyping system, and that they like the system's approach. Most participants were able to complete a given design task and successfully created interactive prototypes, even though an early version of the system providing only a limited set of features was used. The study also revealed several system aspects that have to be improved to increase user satisfaction. The most important ones are error handling and the provision of clear error messages, as well as the support of more interactive user interface elements.

Besides improving the system according to the feedback collected during the user study, it is also planned to implement support for collaborative design, because user interface design is usually done by a team of designers. This support could include several features like the maintenance of different prototype versions or the possibility to give feedback on a colleague's draft. Multimodal input and output could be useful to facilitate collaborative design, for example audio recordings which could be used to collect feedback. Another interesting feature would be to generate an overview picture showing the prototype's navigation structure (e.g. which button leads to the display of which screen). This could help designers to review the navigation in design meetings and to explain the application's basic concept to customers.
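The proposed navigation-structure overview could, for instance, be derived from the screen-change information as a simple textual graph. The following sketch is purely illustrative; the screen and widget names are invented:

```python
# Hypothetical mapping: screen -> {widget that was tapped: target screen}.
navigation = {
    "overview": {"add_button": "new_task", "task_entry": "task_detail"},
    "new_task": {"save_button": "overview"},
    "task_detail": {"back_button": "overview"},
}

def navigation_overview(nav):
    """One line per screen transition, sorted for stable output."""
    return "\n".join(
        f"{screen} --[{widget}]--> {target}"
        for screen, links in sorted(nav.items())
        for widget, target in sorted(links.items())
    )

print(navigation_overview(navigation))
```

A richer implementation might render this as a diagram, but even a textual listing would let designers check the navigation at a glance.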

Summing up, there are various possible improvements and extensions for the prototyping system presented in this paper. What has to be kept in mind during the further implementation is that the system has to remain simple to use and should not distract designers from sketching, as it is of utmost importance not to hinder their creativity. For a later development stage, it is planned to perform another user study with designers, in order to evaluate the system's usefulness for professionals and to compare its efficiency with traditional prototyping approaches.

Acknowledgements

The research presented is conducted within the Austrian project "AIR – Advanced Interface Research" funded by the Austrian Research Promotion Agency (FFG), the ZIT Center for Innovation and Technology and the province of Salzburg under contract number 825345.

REFERENCES

1. android.widget — Android Developers, 2011. http://developer.android.com/reference/android/widget/package-summary.html (Last retrieved October 20, 2011).

2. Anoto Group, 2011. http://www.anoto.com/ (Last retrieved September 29, 2011).

3. Bolchini, D., Pulido, D., and Faiola, A. FEATURE: "Paper in screen" prototyping: an agile technique to anticipate the mobile experience. interactions 16 (July 2009), 29–33.

4. Brandl, P., Richter, C., and Haller, M. NiCEBook: supporting natural note taking. In Proceedings of the 28th international conference on Human factors in computing systems, ACM (2010), 599–608.

5. Coyette, A., and Vanderdonckt, J. A sketching tool for designing anyuser, anyplatform, anywhere user interfaces. Human-Computer Interaction–INTERACT 2005 (2005), 550–564.

6. de Sa, M., and Carrico, L. Low-fi prototyping for mobile devices. In CHI '06 extended abstracts on Human factors in computing systems, ACM (2006), 694–699.

7. de Sa, M., and Carrico, L. Lessons from early stages design of mobile applications. In Proceedings of the 10th international conference on Human computer interaction with mobile devices and services, ACM (2008), 127–136.

8. de Sa, M., and Carrico, L. A mobile tool for in-situ prototyping. In Proceedings of the 11th International Conference on Human-Computer Interaction with Mobile Devices and Services, ACM (2009), 1–4.

9. de Sa, M., Carrico, L., Duarte, L., and Reis, T. A mixed-fidelity prototyping tool for mobile devices. In Proceedings of the working conference on Advanced visual interfaces, ACM (2008), 225–232.

10. Landay, J. Interactive Sketching for the Early Stages of User Interface Design. PhD thesis, Carnegie Mellon University, 1996.

11. Lewis, J. IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. International Journal of Human-Computer Interaction 7, 1 (1995), 57–78.

12. Li, Y., Cao, X., Everitt, K., Dixon, M., and Landay, J. FrameWire: a tool for automatically extracting interaction logic from paper prototyping tests. In Proceedings of the 28th international conference on Human factors in computing systems, ACM (2010), 503–512.

13. Liao, C., Guimbretiere, F., Hinckley, K., and Hollan, J. PapierCraft: A gesture-based command system for interactive paper. ACM Transactions on Computer-Human Interaction (TOCHI) 14, 4 (2008), 1–27.

14. Lin, J., Newman, M., Hong, J., and Landay, J. DENIM: finding a tighter fit between tools and practice for Web site design. In Proceedings of the SIGCHI conference on Human factors in computing systems, ACM (2000), 510–517.

15. Lin, J., Thomsen, M., and Landay, J. A visual language for sketching large and complex interactive designs. In Proceedings of the SIGCHI conference on Human factors in computing systems: Changing our world, changing ourselves, ACM (2002), 307–314.

16. Lumsden, J., and MacLean, R. A Comparison of Pseudo-Paper and Paper Prototyping Methods for Mobile Evaluations. In The International Workshop on MObile and NEtworking Technologies for social applications (MONET'2008), part of the LNCS OnTheMove (OTM) Federated Conferences and Workshops (2010).

17. Rettig, M. Prototyping for tiny fingers. Communications of the ACM 37, 4 (1994), 21–27.

18. Snyder, C. Paper prototyping: The fast and easy way to design and refine user interfaces. Morgan Kaufmann Publishers, 2003.

19. Virzi, R., Sokolov, J., and Karis, D. Usability problem identification using both low- and high-fidelity prototypes. In Proceedings of the SIGCHI conference on Human factors in computing systems: common ground, ACM (1996), 236–243.

20. Wilson, P. Active Story: A Low Fidelity Prototyping and Distributed Usability Testing Tool for Agile Teams. MSc thesis, University of Calgary, August 2008.
