

Vinícius Costa Villas Bôas Segura

UISKEI: Sketching the User Interface and Its Behavior

DISSERTAÇÃO DE MESTRADO

Dissertation presented to the Postgraduate Program in Informatics of the Departamento de Informática, PUC-Rio as partial fulfillment of the requirements for the degree of Mestre em Informática.

Advisor: Simone Diniz Junqueira Barbosa

Rio de Janeiro, March 2011


Vinícius Costa Villas Bôas Segura

UISKEI: Sketching the User Interface and Its Behavior

Dissertation presented to the Postgraduate Program in Informatics of the Departamento de Informática do Centro Técnico Científico da PUC-Rio, as partial fulfillment of the requirements for the degree of Mestre.

Profa. Simone Diniz Junqueira Barbosa Advisor

Departamento de Informática - PUC-Rio

Prof. Hugo Fuks Departamento de Informática - PUC-Rio

Prof. Alberto Barbosa Raposo Departamento de Informática - PUC-Rio

Prof. José Eugênio Leal Coordinator of the Centro Técnico Científico - PUC-Rio

Rio de Janeiro, March 30, 2011


All rights reserved.

Vinícius Costa Villas Bôas Segura

Graduated in Computer Engineering from PUC-Rio in 2008, he works at Tecgraf, a computer graphics laboratory at PUC-Rio. His main focus areas are Human-Computer Interaction and Computer Graphics.

Bibliographic data

Segura, Vinícius Costa Villas Bôas

UISKEI: sketching the user interface and its behavior / Vinícius Costa Villas Bôas Segura ; advisor: Simone Diniz Junqueira Barbosa. – 2011.

95 f. : il. (color.) ; 30 cm

Dissertação (mestrado) – Pontifícia Universidade Católica do Rio de Janeiro, Departamento de Informática, 2011.

Inclui bibliografia

1. Informática – Teses. 2. Interação baseada em caneta. 3. Esboços de interface. 4. Prototipação no estágio inicial. 5. Desenho do comportamento da interface. I. Barbosa, Simone Diniz Junqueira. II. Pontifícia Universidade Católica do Rio de Janeiro. Departamento de Informática. III. Título.

CDD: 004


To my grandma, one of my first teachers.


Acknowledgments

To CNPq, for funding my research.

To Simone Barbosa, the best advisor a student could ever ask for. Always present and helpful, knowing how to motivate through curiosity and provide guidance with kindness, instead of adopting the typical "Prof. Smith attitude".

To my family, for the continued support and for keeping up the positive thinking even from afar.

To Clarissa, who may have spent some time away, but is now closer than ever, playing a big role in my life. Thanks for all the encouragement and motivation; you give me the strength to power through.

To PUC people, especially Laris, Jan, Baère and Paula, who accompanied me through the worries of the Masters program and still found time for movie night almost every Friday.

To Tec people, the ones responsible for the soundboard that I now have in my mind, for raising my glucose to dangerous levels with super-sized candies, and for all the fun both inside and outside our workplace.

To pH people, especially Nat, Fê, Mari and Pedro, for all the good times and the support during this phase.

To my childhood friends, Ray and Bia, for still being there no matter what.


Abstract

Segura, Vinícius Costa Villas Bôas; Barbosa, Simone Diniz Junqueira. UISKEI: Sketching the User Interface and Its Behavior. Rio de Janeiro, 2011. 95p. MSc. Dissertation – Departamento de Informática, Pontifícia Universidade Católica do Rio de Janeiro.

During the early user interface design phase, different solutions should be explored and iteratively refined by the design team. In this rapidly evolving scenario, a tool that enables and facilitates changes is of great value. UISKEI takes the power of sketching, allowing the designer to convey his or her idea in a rough and more natural form of expression, and adds the power of computing, which makes manipulation and editing easier. More than an interface prototype drawing tool, UISKEI also features the definition of the prototype behavior, going beyond navigation between user interface containers (e.g. windows, web pages, screen shots) and allowing the designer to define changes to the state of user interface elements and widgets (enabling/disabling widgets, for example).

This dissertation presents the main concepts underlying UISKEI and a study on how it compares to similar tools. The user interface drawing stage is detailed, explaining how sketches are converted to widgets by combining a sketch recognizer, which uses the Levenshtein distance as a similarity measure, with the interpretation of recognized sketches based on an evolution tree. Furthermore, it discusses the different solutions explored to address the issue of drawing an interaction, proposing an innovative mind-map-like visualization approach that enables the user to express the event, conditions and actions of each interaction case while still keeping the pen-based interaction paradigm in mind.

Keywords

Pen-based interaction; user interface sketching; ink; early prototyping; sketching user interface behavior


Resumo

Segura, Vinícius Costa Villas Bôas; Barbosa, Simone Diniz Junqueira. UISKEI: Sketching the User Interface and Its Behavior. Rio de Janeiro, 2011. 95p. Dissertação de Mestrado – Departamento de Informática, Pontifícia Universidade Católica do Rio de Janeiro.

Durante o estágio inicial do design de uma interface com usuário, diferentes soluções devem ser exploradas e refinadas iterativamente pela equipe de design. Nesse cenário de mudanças rápidas e constantes, uma ferramenta que permita e facilite essas mudanças é de grande valia. UISKEI explora o poder do desenho, possibilitando ao designer transmitir sua ideia com uma forma de expressão mais natural, e adiciona o poder computacional, facilitando a manipulação e edição dos elementos. Mais do que uma ferramenta de desenho de protótipos de interface, UISKEI também permite a definição do comportamento da interface, indo além da navegação entre contêineres de interfaces (por exemplo, janelas, páginas web, capturas de telas) e possibilitando definir mudanças nos estados dos elementos de interface (habilitando e desabilitando-os, por exemplo).

Essa dissertação apresenta os conceitos principais por trás do UISKEI e um estudo de como ele se compara a ferramentas similares. A etapa de desenho da interface é detalhada, explicando como a conversão dos traços em widgets é feita através da combinação de um reconhecedor de traços, que usa a distância de Levenshtein como medida de similaridade, e a interpretação dos traços reconhecidos baseada em uma árvore de evoluções. Além disso, também são discutidas as diferentes soluções exploradas para endereçar o problema do desenho da interação, propondo uma visualização inovadora no estilo mind-map que possibilita ao usuário expressar o evento, as condições e ações de cada caso de interação, sem abandonar o paradigma da interação com caneta.

Palavras-chave

Interação baseada em caneta; esboços de interface; prototipação no estágio inicial; desenho do comportamento da interface


Contents

1 Introduction __________________________________________ 14

2 Related work _________________________________________ 16

2.1 Mouse-based prototyping software ____________________ 16

2.1.1. Microsoft Visio ______________________________ 16

2.1.2. Balsamiq ___________________________________ 17

2.1.3. Axure RP Pro _______________________________ 18

2.2 Pen-based prototyping software ______________________ 19

2.2.1. DENIM ____________________________________ 19

2.2.2. SketchiXML ________________________________ 21

2.2.3. CogTool ___________________________________ 23

2.3 Summary ________________________________________ 25

3 UISKEI ______________________________________________ 27

3.1 Early version study ________________________________ 30

3.2 New version requirements ___________________________ 31

4 Drawing the interface ___________________________________ 32

4.1 Segments _______________________________________ 32

4.2 Shapes _________________________________________ 34

4.3 Element descriptors ________________________________ 35

4.4 Recognition ______________________________________ 37

4.5 Element properties ________________________________ 38

5 Drawing the behavior ___________________________________ 41

5.1 ECA ____________________________________________ 41

5.1.1. Events (When) ______________________________ 42

5.1.2. Conditions (If) _______________________________ 42

5.1.3. Actions (Do) ________________________________ 43

5.1.4. Valid ECAs _________________________________ 44

5.2 Drawing ECAs ____________________________________ 44


6 Prototype evaluation ___________________________________ 50

7 Recognizer test _______________________________________ 53

8 User evaluation study __________________________________ 59

8.1 Evaluation method _________________________________ 60

8.2 Evaluation results _________________________________ 62

8.3 Participants’ opinions ______________________________ 65

9 Conclusion and Future work _____________________________ 67

9.1 Shapes and element descriptors ______________________ 68

9.2 Recognizer ______________________________________ 69

9.3 ECA ____________________________________________ 69

9.4 Prototype evaluation _______________________________ 70

10 Bibliography __________________________________________ 71

11 Appendix A: Implementing UISKEI ________________________ 75

11.1 uskModel ________________________________________ 75

11.2 uskRecognizer ____________________________________ 78

11.3 uskWizard _______________________________________ 80

12 Appendix B: Evaluation study script ________________________ 84

12.1 Description ______________________________________ 84

12.2 Questionnaire ____________________________________ 85

12.3 First Cycle _______________________________________ 87

12.3.1. Scenario 01 _______________________________ 87

12.3.2. Task 01 __________________________________ 87

12.4 Second Cycle ____________________________________ 88

12.4.1. Scenario 02 _______________________________ 88

12.4.2. Task 02 __________________________________ 88

12.5 Third Cycle ______________________________________ 89

12.5.1. Scenario 03 _______________________________ 89

12.5.2. Task 03 __________________________________ 90

13 Appendix C: Complete test results _________________________ 91


List of Figures

Figure 1: Microsoft Visio 2007. _______________________________ 17

Figure 2: Balsamiq. ________________________________________ 18

Figure 3: Axure RP Pro. ____________________________________ 18

Figure 4: Defining behavior with Axure RP Pro. __________________ 19

Figure 5: DENIM. _________________________________________ 20

Figure 6: DENIM gesture system. _____________________________ 20

Figure 7: DENIM’s representation of conditionals. ________________ 21

Figure 8: SketchiXML. _____________________________________ 22

Figure 9: Different levels of representation in SketchiXML. _________ 23

Figure 10: CogTool. ________________________________________ 24

Figure 11: Simulation in CogTool. _____________________________ 24

Figure 12: A simple interaction design model. ____________________ 27

Figure 13: Action Manager of UISKEI’s early version. ______________ 30

Figure 14: Drawing a rectangle in three different ways. _____________ 32

Figure 15: Multiple strokes that should be grouped but not merged. ___ 33

Figure 16: The directions compass rose. ________________________ 34

Figure 17: The rectangle shape as a string. ______________________ 34

Figure 18: Language to create elements. ________________________ 36

Figure 19: Douglas-Peucker algorithm. _________________________ 37

Figure 20: Labels. __________________________________________ 39

Figure 21: ECAMan interface. ________________________________ 42

Figure 22: ECA buttons. _____________________________________ 45

Figure 23: Pie menu allows a single stroke to define all parameters. ___ 46

Figure 24: Element condition (left) and action (right) pie menus. ______ 47

Figure 25: Presentation unit filmstrip. ___________________________ 47

Figure 26: DENIM combinatorial explosion. ______________________ 48

Figure 27: ECA mind-map-like representation. ____________________ 49

Figure 28: Prototype evaluation example. _______________________ 51

Figure 29: The evolution of login screens through the cycles. ________ 59


Figure 30: Average scores per question. ________________________ 62

Figure 31: Average score per group of questions. _________________ 64

Figure 32: Summary of interview results. ________________________ 64

Figure 33: Class diagram for uskModel. _________________________ 75

Figure 34: ElementDescriptor class. ____________________________ 77

Figure 35: Operations enums. ________________________________ 78

Figure 36: Class diagram for uskRecognizer. _____________________ 79

Figure 37: Class diagram for uskWizard. ________________________ 81


List of Tables

Table 1: Software comparison. ______________________________ 25

Table 2: Elements’ states. __________________________________ 40

Table 3: Condition and action creation. ________________________ 45

Table 4: Top 10 recognizer configuration results organized by success percentage. _______________________________ 55

Table 5: Top 10 recognizer configuration results organized by average shape success percentage. __________________ 56

Table 6: Shape analysis for best success percentage configuration. _ 57

Table 7: Shape analysis for best average success percentage configuration. ____________________________________ 58

Table 8: UISKEI comparison with related software. ______________ 68

Table 9: Tools presentation order. ____________________________ 91

Table 10: Questionnaire answers. _____________________________ 91

Table 11: First cycle questionnaire answers and statistics. __________ 92

Table 12: Second cycle questionnaire answers and statistics. _______ 93

Table 13: Third cycle questionnaire answers and statistics. _________ 94


E como as esperanças têm esse fado que cumprir, nascer uma das outras, por isso é que, apesar de tantas decepções, ainda não se acabaram no mundo.

José Saramago, As Intermitências da Morte

I almost wish I hadn't gone down that rabbit hole - and yet - and yet - it's rather curious, you know, this sort of life.

Lewis Carroll, Alice's Adventures in Wonderland


1 Introduction

During the early user interface (UI) design phase, different design solutions should be explored and iteratively refined by the design team. Sketching on paper is acknowledged as a quick way to document ideas and design details before they are forgotten. However, paper sketches are hard to keep and modify, particularly in a rapidly evolving scenario, which often results in having to redraw the mockups several times (Landay & Myers, 1995). Incorporating the sketch's natural, unconstrained and informal virtues, for which it is praised in many areas (Kieffer, Coyette, & Vanderdonckt, 2010), into a computational solution could make early prototyping easier and faster. As stated by Gross, "certainly there are many practical benefits to addressing and resolving the challenges of sketch-based interaction and modeling: the design, graphics, and media industries depend heavily on drawing, and being able to engage with artists and designers in their (still) preferred medium of choice is a tremendous advantage" (Gross, 2009).

Even with the advantages of sketches, mouse-based interaction is very popular and well established amongst users. After so many years of use, mouse-based computer interfaces for drawing graphics are considered so efficient and "natural" that changing from mouse to pen nowadays would require a shift in industry practices (Kurtenbach, 2010). Besides that, most applications that come with pen-based computers such as Tablet PCs are still designed with mouse-based interaction in mind, usually ignoring the power of pen strokes and sketching (Davis, Saponas, Shilman, & Landay, 2007).

However, as technology advances and bigger screens and portable devices become more popular, the mouse's supremacy may be at stake: it may still be the predominant pointing device for conventional desktop Graphical User Interfaces (GUIs), but as new devices arise, understanding new input methods, among them pen input, is becoming more important (Po, Fisher, & Booth, 2005). Developing a pen-based application implies a paradigm break, going beyond a simple redesign and actually rethinking the application to take advantage of the pen input's intrinsic capacity for rapid, direct, modeless 2D expression (Gross, 2009).

This dissertation presents UISKEI (User Interface Sketching and Evaluation Instrument), a tool developed with the use of a Tablet PC in mind, so as to explore how the pen can be used in a "paperless" user interface prototyping application. While maintaining the natural character of drawing shapes on paper, it adds the computational power of moving shapes, resizing them and interpreting them as interface elements. More than an interface prototype drawing tool, UISKEI also features the definition of the prototype's behavior, going beyond navigation between user interface containers (e.g. windows, web pages, screen shots) to allow changes to be made to the state of user interface elements and widgets (enabling/disabling widgets, for example).

This dissertation also presents a study and comparison of similar tools (Chapter 2) and the main concepts underlying UISKEI (Chapter 3). The user interface drawing stage is detailed in Chapter 4, explaining how sketches are converted to widgets by combining a sketch recognizer, which uses the Douglas-Peucker simplification algorithm and the Levenshtein distance as a similarity measure, with the interpretation of recognized sketches based on an evolution tree. Chapter 5 discusses the different solutions explored to address the issue of drawing an interface behavior, proposing an innovative mind-map-like visualization approach that enables the user to express the event, conditions and actions of each interaction case, while still keeping the pen interaction paradigm in mind. Chapter 6 explains how end users can interact with a simulation of the sketched user interface and thus help to evaluate its behavior. Chapter 7 describes a test to determine the values to be used in the recognizer. Chapter 8 presents an evaluation study of creating a prototype, from interface building and behavior definition to prototype evaluation, using UISKEI and two other techniques (paper prototyping and Balsamiq®). The last chapter (Chapter 9) suggests future work and presents the final conclusions of this dissertation.
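The recognition pipeline summarized above (Douglas-Peucker simplification of the stroke, encoding of the resulting segments as compass directions, and Levenshtein-distance matching against shape templates) can be sketched roughly as follows. This is an illustrative reconstruction, not UISKEI's actual code: the 4-direction compass, the template strings and the epsilon threshold are simplifications chosen for the example, and Chapter 4 details the real encoding.

```python
import math

def douglas_peucker(points, epsilon):
    """Simplify a polyline, keeping points farther than epsilon from the chord."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        x0, y0 = points[i]
        den = math.hypot(x2 - x1, y2 - y1)
        if den == 0:  # closed stroke: the chord degenerates to a point
            d = math.hypot(x0 - x1, y0 - y1)
        else:
            d = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1) / den
        if d > dmax:
            dmax, index = d, i
    if dmax > epsilon:  # recurse around the most distant point
        left = douglas_peucker(points[:index + 1], epsilon)
        right = douglas_peucker(points[index:], epsilon)
        return left[:-1] + right
    return [points[0], points[-1]]

# Hypothetical 4-direction compass rose (screen coordinates, y grows downward);
# UISKEI's actual rose is finer-grained (see Figure 16).
COMPASS = "ESWN"

def to_direction_string(points):
    """Encode each simplified segment as its nearest compass direction."""
    out = []
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        angle = math.degrees(math.atan2(y2 - y1, x2 - x1)) % 360
        out.append(COMPASS[int(((angle + 45) % 360) // 90)])
    return "".join(out)

def levenshtein(a, b):
    """Edit distance via the classic dynamic-programming recurrence."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[-1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def recognize(stroke, templates, epsilon=1.0):
    """Match a raw stroke against template shapes encoded as direction strings."""
    s = to_direction_string(douglas_peucker(stroke, epsilon))
    return min(templates, key=lambda name: levenshtein(s, templates[name]))
```

With these definitions, a clockwise rectangular stroke simplifies to the direction string "ESWN" and is matched to a template such as {"rectangle": "ESWN"} even when the drawn corners are imprecise, which is the property the Levenshtein measure provides.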


2 Related work

Several tools can aid the prototyping stage, with many approaches available: desktop or web-based applications, UI-specific or generic diagrammatic solutions, mouse-based or pen-based interaction, amongst others. Lists of such products can be found online, for example in (Harrelson, 2009) or (Barber, 2009). The following sections present an overview of some widely used applications. Section 2.1 analyzes mouse-based prototyping software and Section 2.2, pen-based ones. Section 2.3 presents a comparison table summarizing the analyzed applications.

2.1 Mouse-based prototyping software

2.1.1. Microsoft Visio

Microsoft Visio1 is a tool for creating various types of diagrams, from flowcharts to Windows-style user interfaces. For each type, a pre-defined set of elements is presented and, by drag-and-drop, the selected ones are added to the representation being built.

Although widely known and used, Microsoft Visio has significant limitations. Since it handles a wide array of diagram types, it is a rather generic solution to the UI prototyping problem. Therefore, specific behavior is not supported, allowing only navigation between screens. Its strength resides in the fact that the mockup interface is really similar to the final result in a Windows XP environment, as seen in Figure 1.

1 Information about the latest version can be found at http://office.microsoft.com/en-us/visio/


Figure 1: Microsoft Visio 2007.

2.1.2. Balsamiq

Balsamiq2 is a very popular UI prototyping tool as of the writing of this dissertation. It presents a collection of 75 elements that can be added to the mockup, ranging in complexity from simple buttons and labels to more complex ones, such as a formatting toolbar or an iTunes-like cover flow.

Elements are added by drag-and-drop, and the only possible interaction with them is as hyperlinks between mock-ups (for example, a checkbox cannot toggle its "checked" state during the "full-screen presentation"). This limitation is a major disadvantage of the software: if designers want to create an interactive prototype, they must build copies of the same interface with the elements in different states and then create links between these mock-ups. Moreover, not all elements can have hyperlinks associated with them. For instance, cover flow, numeric stepper and playback controls cannot trigger navigation actions.

Although not pen-based, Balsamiq's elements have a "sketchy" look-and-feel, which can be seen in Figure 2. This helps to enhance the conceptual difference between "prototype" and "product" to the final user, and thus "can help to disarm those who think that suddenly your software is 'done'" (Harrelson, 2009).

2 http://www.balsamiq.com/

Figure 2: Balsamiq3.

2.1.3. Axure RP Pro

Another UI prototyping tool is Axure RP Pro4. It supports defining forms of interaction beyond merely navigating between screens, allowing the generation of a functional prototype with less effort. Its interface can be seen in Figure 3.

Figure 3: Axure RP Pro5.

3 Image taken from: http://balsamiq.wpengine.netdna-cdn.com/images/help_3mainareas.png

4 http://www.axure.com/

5 Image taken from: http://www.axure.com/images/training-axurerpenvironment-interface.jpg


Elements are also added by drag-and-drop, and there is an extensive list of properties to be defined for each element. The form-based paradigm extends to the definition of interactive behavior, requiring the user to fill out several fields with several mouse clicks, as can be seen in Figure 4.

Figure 4: Defining behavior with Axure RP Pro6.

2.2 Pen-based prototyping software

2.2.1. DENIM

DENIM7 (Lin, Thomsen, & Landay, 2002) explores the pen-based interaction paradigm to aid the initial stages of website development. Its main characteristic is the different zoom levels to view the project, going from a macro vision (the site map) to a micro vision (a single page). Pen strokes easily create links between pages by dragging lines between them, as can be seen in Figure 5, in which the "Home" page links to the "Weather" page.

6 Images taken from: http://www.axure.com/images/training-interactions-dialog.jpg and http://www.axure.com/images/training-conditionallogic-multipleconditions.jpg

7 http://dub.washington.edu:2007/denim/


Figure 5: DENIM.

DENIM also features a gesture system that flows naturally along with the drawing of pages. If the user holds the pen's barrel button or the CTRL key, the drawing is interpreted as a gesture, following the language presented in Figure 6. The gesture system allows frequent operations, such as undo/redo and cut/copy/paste, to be executed directly from the canvas without changing the drawing paradigm, by adding an implicit mode of interaction (a "gesture mode" activated by holding the pen's barrel button or the CTRL key).

Figure 6: DENIM gesture system8.

8 Images taken from online documentation available at: http://dub.washington.edu:2007/projects/denim/docs/HTML/quick_ref/gestures.html


However, even though DENIM is heavily based on drawing, the addition of WIMP (Windows, Icons, Menus and Pointers) elements still relies on specific tools that work as "stamps", as can be seen in the lower bar of Figure 5. Another limitation is that the only available actions are navigational (hyperlinks), although it is possible to make navigation conditional on the state of elements. Such conditionals are displayed one at a time, without highlighting which component is related to each conditional. As can be seen in Figure 7, the checkbox is responsible for the two possible navigational paths, but it is not highlighted in any way.

Figure 7: DENIM's representation of conditionals9.

One nice feature is the idea of a "custom component", which allows the use of a user-defined element in the application. The operations regarding custom components (such as creating, adding and editing) are accessible only through the pie menu, so it is not possible to add these components through drawing or stamps as with the regular ones.

2.2.2. SketchiXML

SketchiXML10 is a "multi-agent application able to handle several kinds of hand-drawn sources as input, and to provide the corresponding specification in UsiXML" (Coyette, Faulkner, Kolp, Limbourg, & Vanderdonckt, 2004). It focuses on UI sketching and has its own gestural language to add elements through drawing, which can be seen in Figure 8.

9 Images taken from online documentation available at: http://dub.washington.edu:2007/projects/denim/docs/HTML/tutorial/Using_Conditional.htm

10 http://www.usixml.org/index.php?mod=pages&id=14


Figure 8: SketchiXML11.

Not all elements can trigger actions. For instance, a button can trigger multiple actions, but an image can trigger none. Moreover, these actions are limited to navigation between screens. Behavior definition is done in the "Navigate" mode, where the screens are presented as thumbnails in a 2-D space. The user can then organize them, since he/she was unaware of this 2-D space while building the screens. The addition of actions explores the pen input: the user draws a line connecting the element that will trigger the action to the screen that will be shown. When a valid connection is available, the line being drawn changes its color, giving feedback that an action can be created. After the pen is lifted, a pop-up menu appears so the user can choose which action will be created (open/close, minimize/maximize, bring to front/back). The creation of an action therefore happens in two steps: drawing a line to determine the trigger element and the target screen, and then selecting the action from the pop-up menu.
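This two-step flow can be sketched as a small state machine, illustrated below in Python. The `Element`/`ActionLinker` names and the rectangular hit-testing are hypothetical constructions for this example; SketchiXML's real implementation and API differ, and it also supports attaching multiple actions to one trigger.

```python
from dataclasses import dataclass

@dataclass
class Element:
    """A widget or screen thumbnail occupying a rectangular region."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

# The actions offered by the pop-up menu in step two.
ACTIONS = {"open", "close", "minimize", "maximize", "bring to front", "send to back"}

class ActionLinker:
    """Step 1: drag a line from a trigger element onto a target screen thumbnail.
    Step 2: pick the action from a pop-up menu shown on pen-up."""

    def __init__(self, elements, screens):
        self.elements = elements  # widgets that may trigger actions
        self.screens = screens    # screen thumbnails laid out in the 2-D space
        self.trigger = None

    def _screen_at(self, px, py):
        return next((s for s in self.screens if s.contains(px, py)), None)

    def pen_down(self, px, py):
        # The gesture starts on the element that will trigger the action.
        self.trigger = next((e for e in self.elements if e.contains(px, py)), None)

    def line_color(self, px, py):
        # Feedback while dragging: a color change signals a valid connection.
        return "highlight" if (self.trigger and self._screen_at(px, py)) else "ink"

    def pen_up(self, px, py, chosen_action):
        # The pop-up menu choice completes the action, if the connection is valid.
        target = self._screen_at(px, py)
        if self.trigger and target and chosen_action in ACTIONS:
            return (self.trigger.name, chosen_action, target.name)
        return None
```

The split between the drawn connection (pen stroke) and the action choice (menu) is what keeps the gesture itself unambiguous: the stroke only ever means "link this element to that screen".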

A particular characteristic of SketchiXML is that it offers different levels of visual representation for the mockup: the original stroke, a "beautified" stroke, a conceptual version of the element, and the element as it would be shown in the interface of the currently running operating system. The different representation levels can be seen in Figure 9, starting from the top-left corner and going clockwise:

11 Image taken from: http://www.usixml.org/images/sketchixml_09.png


Figure 9: Different levels of representation in SketchiXML.

2.2.3. CogTool

CogTool12 is software being developed at Carnegie Mellon University that, besides creating UI prototypes, "automatically evaluates your design with a predictive human performance model" (Carnegie Mellon University, 2009). One difference from all the other evaluated tools is that CogTool supports different input devices (not only the usual keyboard and mouse, but also touch screen and microphone) as well as audio as an output device in addition to the monitor screen.

In CogTool, a project consists of frames, which correspond to the windows of the interface being designed. Elements are defined by drawing the rectangular area they will occupy. Despite this rectangle-drawing component, the user must select the corresponding tool to define each element's type.

Between frames it is possible to add a transition, with an associated event (such as a mouse or keyboard input). These transitions can be shown in a storyboard-like fashion, allowing an overview of the project and the relations between frames, as can be seen in Figure 10.

12 http://cogtool.hcii.cs.cmu.edu/


Figure 10: CogTool.

Having defined the frames and transitions, it is possible to run a GOMS task analysis simulation with a "cognitive crash dummy" (as described on the project's webpage), measuring the time elapsed in each step. In the end, a graph summarizing the results is displayed, as shown in Figure 11.

Figure 11: Simulation in CogTool.


Despite providing a good task analysis, the prototype itself is very simplified, since all user interface elements are graphically represented as rectangles (bounding boxes) with which users can interact. As the main focus is the automated task analysis to be used by the design team, this simplified representation is justified. However, if the prototype is presented to an end user, we believe that displaying only bounding boxes could cause some confusion.

2.3 Summary

Table 1 presents a comparison of the tools described in this chapter. Each line represents a feature, showing whether the tool has (y) or does not have (n) that feature. When a feature depends on another, it is indented in a tree-like fashion with ">>". In this case, if the tool does not have the "parent" feature, it is marked with an "x". Empty cells mean that the feature was not evaluated.

Table 1: Software comparison. Columns, in order: Microsoft Visio, Axure RP Pro, Denim, SketchiXML, Balsamiq, CogTool.

Free: n n y y n y
UI-prototype exclusive (vs generic diagrammatic tool): n y y y y n
UI components for multiple environments (vs web-page prototype only): y y n y y y
Drawing widgets: n n n y n n
>> Evolution of widgets: x x x n x x
Element manipulation: y n n y y
Undo/Redo: y y y y y
Group/Ungroup: y y n y n
>> Select internal objects: y x x n x
Cut/Copy/Paste: y y y y y
>> Copies the action: n n n y n
Zoom levels: y y y y y
Guidelines: n n n y n
Layer ordering: y n n y y
Sketchy visual: n n y y y y
Actions: n y y y y y
>> Beyond navigation: x y n n n n
>> Sketchy interaction: x n y y n y
>> Event: x y y n n y
>> Conditions: x n y n n n
Prototype evaluation: x y y n y y
Save: y y y n y y


As we will see in the next chapters, UISKEI targets the prototyping process with a pen-based interaction approach, not only during interface building but also when defining the interface behavior. Moreover, the behavior defined should go beyond navigational purposes and be conditionally triggered, a combination present only in Axure RP Pro, which, however, lacks the sketchy interaction.


3 UISKEI

There is a common belief that to build a good user interface, we must refine the solutions iteratively, testing with users to gather feedback and revising the solutions as many times as possible (Szekely, 1994). This ideal interaction design life cycle encompasses a series of inter-related activities, which are part of a greater iterative process. A simple model of this idea can be found in (Preece, Rogers, & Sharp, 2002) and is displayed in the figure below:

Figure 12: A simple interaction design model.13

UISKEI (User Interface Sketching and Evaluation Instrument) was developed to aid this life cycle in the early prototyping phase, considering that it can be summarized in three major iterative stages:

1. Interface building: the "(Re)Design" stage, choosing which elements compose the interface and their position, size, etc.;

2. Behavior definition: the "Build an interactive version" stage, describing how the prototype works;

3. Prototype evaluation: the "Evaluate" stage, allowing end users to interact with the prototype to gather their feedback.

13 Image 6.7, taken from Section 6.4.1 of (Preece, Rogers, & Sharp, 2002)


During the early prototyping phase, sketching can be highly beneficial due to its inherent speed, allowing rapid exploration and iteration of different ideas, and its ambiguity, allowing the designer to focus on basic structural issues rather than unimportant details while also allowing multiple interpretations, which can lead to new ideas (Lin, Thomsen, & Landay, 2002). Also, the freeform nature of sketching allows the designer to be more creative and exploratory than when using the computer (Hong, Landay, Long, & Mankoff, 2002).

If pen-based interaction resembles the paper experience, why do designers find it easier to sketch on paper? Hammond et al. explain this by claiming that "a gulf still exists between the sketch recognition system and the user. To the user, a new mode of interaction is occurring, pen input; however, this conceptual model is inaccurate as the computer still interprets the pen under the mouse/keyboard archetype. No longer can the pen merely stand in for the mouse; rather, a new paradigm of human-computer interaction must be designed around the pen and recognition of the pen input. Pen-based interfaces should provide interpretation and feedback in a natural and intuitive manner, rather than locking the user into mouse-like interactions." (Hammond, Lank, & Adler, 2010)

UISKEI tries to overcome this gulf, creating a rapid "paperless" early prototyping environment that grants the flexibility and speed of the pen-and-paper version along with interesting computational features, such as moving and resizing.

"Sketching is fundamental to ideation and design. (…) Designers do not draw sketches to externally represent ideas that are already consolidated in their minds. Rather, they draw sketches to try out ideas, usually vague and uncertain ones" (Tohidi, Buxton, Baecker, & Sellen, 2006b). By relying on sketching, we aim to stimulate the exploration of the solution space during the early phases of design. This allows the low-cost development of more design alternatives and the possibility of refining them, increasing the chances of getting the design right (Tohidi, Buxton, Baecker, & Sellen, 2006a).

Sketching UI designs has already shown advantages, according to a study by Kieffer et al. (Kieffer, Coyette, & Vanderdonckt, 2010), as follows:

"UI sketching is preferred over traditional interface builders, especially by end users, and could be performed at different levels of fidelity without losing advantages;

the amount of usability problems discovered with a sketched design is not inferior to those corresponding to a genuine UI;

the expressive power of a sketched UI remains the same;

a sketched UI provides quantitative and qualitative results that are comparable to traditional UI prototypes except that the cost is reduced;

UI sketching encourages exploratory design and fosters communication between stakeholders more than any other prototype;

flexibility is superior to UI builders, authoring tools, and paper prototypes."

Aiming at pen-based interaction and at aiding all three stages of prototyping, the main requirements of UISKEI are detailed below:

Interface building
o Produce mock-ups of graphical user interfaces through pen-based interaction;
o Recognize and convert the sketched elements into interactive elements of the interface model (widgets);
o Manipulate and edit the widgets.

Behavior definition
o Define a case, composed of a set of conditions and actions associated to an event.

Prototype evaluation
o Evaluate an interactive version of the prototype, in order to perform formative evaluation during the design process.

UISKEI's early version will be discussed in Section 3.1 and the main focus of the new version in Section 3.2. The new version's approach to each stage will be detailed in further chapters: interface building in Chapter 4, interaction definition in Chapter 5, and prototype simulation in Chapter 6.


3.1 Early version study

A previous version of UISKEI was developed in 2008 as an undergraduate final project (Segura & Barbosa, 2008). This version already explored the drawing of elements, which is the basis for interface building, but the behavior definition relied heavily on form-based interaction, as can be seen in Figure 13.

Figure 13: Action Manager of UISKEI's early version.

To add a new behavior, the user should first select the element associated with the event, then click the "Manage Actions…" button (the lower button on the element's properties pane, pictured in the top-right portion of Figure 13) to open the "Action Manager" window. In this window (pictured in the left portion of Figure 13), the user should add a new behavior case by clicking the "Add case" button and then define the conditions and actions associated with the behavior case.

To add a condition, he/she should click the "Add condition" button and then choose the parameters in two drop-down lists. To add an action, he/she should first click the "Add action" button, then choose an action type by selecting it amongst the available radio buttons. When the action type was chosen, the parameter pane changed accordingly, as can be seen in the zoomed region of Figure 13, which shows all the possible panes. From this description, it is possible to notice the number of mouse clicks needed to define a behavior, completely breaking the pen-based interaction paradigm of the interface building phase.

In addition, the recognition algorithm was hard-coded, restricting the known shapes, and had a recognition rate of around 58.6%. An early study (Segura & Barbosa, 2009), considering the end user's role and comparing the paper prototyping evaluation technique to an interaction session supported by UISKEI, showed that UISKEI was generally well accepted by the study participants, thereby justifying working towards a new version.

3.2 New version requirements

The main issue in the early version was the paradigm break between stages: while the interface building was done in a pen-based style, the interaction definition was done with many mouse clicks on a form. Therefore, the major challenges we wanted to address in the new version were to define the interaction on the canvas and to show it to the user in a comprehensible way.

Another improvement was to make the software more customizable, by allowing users to define the collection of elements that can be sketched and recognized. This means that the collection of recognized shapes should be customizable as well, and the hard-coded recognition system should be revised.


4 Drawing the interface

A prototype in UISKEI is composed of presentation units, which are user interface containers that can represent a window or a webpage, for example. Elements are then added to the presentation units. An element can be a widget (recognized element) or a scribble (unrecognized element). The process of creating elements by drawing them will be detailed in the next sections, covering all steps from the user drawing a stroke, to the stroke being recognized as a shape and, finally, to the element being created.

Section 4.1 discusses how UISKEI converts the ink stroke to a Segment data structure. Section 4.2 describes how shapes are defined as strings. Section 4.3 presents the ElementDescriptor concept used to define an element in UISKEI. Section 4.4 details the recognition process. Finally, Section 4.5 enumerates the available elements' properties.

4.1 Segments

When the user starts drawing, he/she may express his/her idea using a single line or several lines. The ink SDK considers that a stroke was created every time the user lifts the pen, even if he/she is in the middle of a drawing. Multiple strokes therefore frequently need to be combined into a single segment. For example, a rectangle can be drawn using a single stroke or more than one, as seen in the figure below (where the dot marks the initial point of each stroke):

Figure 14: Drawing a rectangle in three different ways.

As can be seen, the leftmost rectangle (Figure 14a) was drawn with a single stroke, while the other ones were drawn with different combinations of two strokes: Figure 14b has the initial point of one stroke near the end point of the other, whilst Figure 14c has the initial points and end points of the strokes near one another. To simplify the shape recognition process, the strokes are converted into a Segment, which is a list of points with a bounding box. When strokes can be combined, as in Figure 14b and Figure 14c, they are merged into a single Segment, reversing a stroke's direction if needed (as happens in Figure 14c).

Though merging solves part of the problem, the user can still produce a drawing with multiple strokes that can't be combined into a single one. For example, if he/she wants to write the word "test" or draw a square with an "X" inside, he/she may end up with three strokes in each drawing that should not be combined, as illustrated in the figure below.

Figure 15: Multiple strokes that should be grouped but not merged.

Besides merging strokes into Segments, it is also necessary to group strokes that can't be merged. Therefore, a MultipleSegments is a list of Segments with a bounding box.

When the user draws a stroke, it is converted to a Segment and added to the current MultipleSegments being drawn, with three possible outcomes:

The new Segment is merged with another Segment already in the MultipleSegments (when its initial point is close enough to the initial/end point of the other segment, as in Figure 14b and Figure 14c);

It is added to the MultipleSegments (when it couldn't be merged with another Segment, but is still close enough to be considered part of the drawing, as in Figure 15);

It can't be added.

When the new Segment cannot be added to the current MultipleSegments (or there is none), the current MultipleSegments is ready to be recognized and converted to an element (as will be shown in Section 4.2) and a new MultipleSegments is created with the new Segment.
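The merge decision can be sketched as follows. This is a minimal illustration, not UISKEI's actual code: the point-list representation of a stroke, the `try_merge`/`close` names, and the tolerance value are all assumptions.

```python
def close(p, q, tol=10.0):
    # Hypothetical proximity test: two endpoints are "close enough"
    # when they fall within a tolerance box (UISKEI's actual
    # threshold is not specified in the text).
    return abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol

def try_merge(a, b, tol=10.0):
    # Merge stroke b (a list of points) into stroke a when an
    # endpoint of b lies near an endpoint of a, reversing b's
    # direction if needed (the Figure 14c case).
    if close(a[-1], b[0], tol):    # end of a meets start of b
        return a + b
    if close(a[-1], b[-1], tol):   # end of a meets end of b: reverse b
        return a + b[::-1]
    if close(a[0], b[-1], tol):    # end of b meets start of a
        return b + a
    if close(a[0], b[0], tol):     # start of b meets start of a: reverse b
        return b[::-1] + a
    return None                    # strokes cannot be merged
```

A stroke for which this returns None would instead be grouped into the current MultipleSegments, or start a new one.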


Another way the current MultipleSegments may enter the recognition process is when the user changes the current presentation unit or takes too long between strokes, since we consider that the user has finished his/her drawing after a certain amount of inactivity.

4.2 Shapes

In order for the recognition process to take place, the user's drawings (already converted to Segments and MultipleSegments) must first be associated with a known shape. In UISKEI, a Shape is defined by the series of stroke directions that make up its drawing, similar to the work of (Cha, Shin, & Srihari, 1999). Each direction can be one of the four cardinal directions (N, S, E, W) or the four ordinal directions (NE, NW, SE, SW). To simplify the notation, each direction was associated with a single character, as shown in the following figure:

Figure 16: The directions compass rose.

Based on this abstraction, each shape of p points can be denoted as a string of p-1 characters. But a shape can be drawn in different ways, and the application should respond to how the drawing looks, not how it was made (Sezgin, Stahovich, & Davis, 2006). Taking the rectangle as an example, its drawing has 5 points (to close it, the end point should be in the same position as the first point), so it can be expressed as a string of 4 characters, as illustrated in the following figure:

Figure 17: The rectangle shape as a string.


As can be seen, a shape can be expressed by different combinations of its n directions. If the shape is open (i.e., the end point differs from the initial point), like a '\_|' shape, it has 2 variants: drawing from left to right ("DCA") or from right to left ("EGH"). If the shape is closed, as in the rectangle example, the string depends on the starting point of the drawing (the columns in Figure 17), yielding n possibilities, and on the drawing direction (the rows in Figure 17), with 2 variations (clockwise and counter-clockwise) for each, summing up to 2n variations.

When a user wants to create a new shape, he/she must create a text file containing only the "base case" of the shape and whether it is closed or not, name the file after the shape with a .shp extension, and place it in a specific directory. The text should follow this pattern, where the first term indicates whether it is a closed shape (y) or not (n) and the second term is the string of the "base case":

[y/n] ([A,B,C,D,E,F,G,H]+)

UISKEI will look for available shapes when building the shape library at load time, creating all the variations in the process. For example, the Rectangle.shp file should contain only the line

y CEGA

and UISKEI would generate the 8 possible strings ("CEGA"/"ECAG", "ACEG"/"GECA", "GACE"/"AGEC", "EGAC"/"CAGE").
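The variation generation can be reproduced with a short sketch. It assumes only that the eight direction characters A–H are assigned consecutively around the compass rose, so that opposite directions differ by four letters (which is consistent with the variant pairs listed above); the function names are illustrative, not UISKEI's.

```python
def opposite(c):
    # Directions A..H are evenly spaced around the compass rose,
    # so the opposite direction is 4 steps away (mod 8).
    return chr((ord(c) - ord('A') + 4) % 8 + ord('A'))

def reverse_path(s):
    # Drawing the same shape in the opposite direction reverses the
    # character sequence and flips every direction to its opposite.
    return ''.join(opposite(c) for c in reversed(s))

def variations(base, closed):
    # Open shapes: 2 variants (one per drawing direction).
    # Closed shapes: n starting points x 2 directions = 2n variants.
    if not closed:
        return {base, reverse_path(base)}
    rotations = {base[i:] + base[:i] for i in range(len(base))}
    return rotations | {reverse_path(r) for r in rotations}
```

For the base case "y CEGA" this yields exactly the 8 rectangle strings listed above; for the open '\_|' shape, variations("DCA", False) yields {"DCA", "EGH"}.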

4.3 Element descriptors

Besides a list of Shapes, the recognition process also needs a list of ElementDescriptors. An ElementDescriptor defines many characteristics of a recognized widget, such as how it is drawn, what its possible states are (described later, in Section 4.5), which events it handles and how it handles them, and also how the element can be recognized.

By having a descriptor, the behavior of an element is delegated to it, allowing the customization of the known widgets. Ideally, the user should be able to implement new widgets by creating new ElementDescriptors. However, the definition of a description language to allow the customization of widgets, such as described in (Hammond & Davis, 2006), lies outside the scope of this dissertation. Consequently, the ElementDescriptors were hard-coded, allowing for easier manipulation during the development phase and shedding some light on which operations and operators the descriptor language should support.

At the moment, UISKEI supports the following elements:

Button
Checkbox
DropDown
Frame
Label
Radio
Spinbox
Textbox

Since the creation of the elements is done through drawing, the ElementDescriptor should know how the element is drawn. To do so, it uses the Shape name and may apply some restrictions to it, such as a limit on height and/or width, or whether it was drawn inside a specific element. The latter restriction is responsible for the "evolution" of elements, a characteristic unique to UISKEI among the researched tools. A summary of how the widgets are created is shown in Figure 18:

Figure 18: Language to create elements.


While in recognition mode, a small circle will be converted into a radio button and a small square into a checkbox, for example. Figure 18 shows how both restrictions work: the size restriction determines whether a rectangle becomes a checkbox, a button or a frame, and the evolution restriction determines whether a horizontal line becomes a label or a textbox.

4.4 Recognition

When a MultipleSegments enters the recognition process, each of its Segments is simplified and converted into a string, following the notation described in Section 4.2. The simplification uses the Douglas-Peucker algorithm (Douglas & Peucker, 1973) to find the drawing's significant points, reducing noise, as can be seen in Figure 19, together with a tolerance regarding the minimum size of the line segments to be converted into a "direction character".

Figure 19: Douglas-Peucker algorithm.14

14 Image taken from: http://softsurfer.com/Archive/algorithm_0205/algorithm_0205.htm#Douglas-Peucker Algorithm
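For reference, the Douglas-Peucker simplification can be sketched in a few lines. This is the textbook recursive formulation, not UISKEI's implementation; the tolerance value is left to the caller.

```python
import math

def _perp_dist(p, a, b):
    # Perpendicular distance from point p to the line through a and b.
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    norm = math.hypot(dx, dy)
    if norm == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * px - dx * py + bx * ay - by * ax) / norm

def douglas_peucker(points, tolerance):
    # Keep the endpoints; recursively keep the farthest intermediate
    # point whenever it deviates from the chord by more than tolerance.
    if len(points) < 3:
        return list(points)
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= tolerance:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left[:-1] + right  # drop the duplicated split point
```

Noisy near-straight runs collapse to their endpoints, while genuine corners (the "significant points") survive.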


Then, this string is compared to all the strings corresponding to each preloaded shape using the Levenshtein distance algorithm (Gusfield, 1997, p. 215), with the difference between directions as the cost of the algorithm's substitution operation. The overall distance between the Segment's string and the shape's string is calculated proportionally to the segment's length, so that longer Segments, which are more prone to noise and less accurate, can still be recognized.
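A sketch of this matching step follows. The exact cost function is an assumption (the text says only that the direction difference is used as a cost): here adjacent directions cost less than opposite ones, and the total is normalized by the segment string's length.

```python
def direction_cost(a, b):
    # Angular difference between two direction characters (A..H),
    # scaled to 0.0 (same direction) .. 1.0 (opposite direction).
    d = abs(ord(a) - ord(b)) % 8
    return min(d, 8 - d) / 4.0

def weighted_levenshtein(s, t):
    # Classic dynamic-programming edit distance with the usual
    # substitution cost replaced by the direction difference.
    m, n = len(s), len(t)
    dp = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = float(i)
    for j in range(n + 1):
        dp[0][j] = float(j)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            dp[i][j] = min(
                dp[i - 1][j] + 1.0,                                # deletion
                dp[i][j - 1] + 1.0,                                # insertion
                dp[i - 1][j - 1] + direction_cost(s[i - 1], t[j - 1]),
            )
    return dp[m][n]

def normalized_distance(segment_str, shape_str):
    # Dividing by the segment length keeps long, noisier strokes
    # recognizable under the same threshold.
    return weighted_levenshtein(segment_str, shape_str) / max(len(segment_str), 1)
```

With this weighting, substituting "A" for the adjacent "B" costs only 0.25, while substituting it for the opposite "E" costs a full 1.0.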

If the smallest distance (i.e., the best match) is less than a threshold, the Segment is associated with that Shape. If an association is made, the recognizer runs through the list of loaded ElementDescriptors to check for the first possible match. For the elements' evolution to work, the list should be ordered by descriptor complexity: if an element does not evolve, its complexity is 1; otherwise, its complexity is 1 + the complexity of the "ancestor" element. This guarantees that the most complex elements will be checked first, ensuring, for example, that a horizontal line will generate a label only if it is not possible to generate a textbox with it.
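The complexity ordering can be illustrated with a small sketch. The descriptor fields, and the assumption that a textbox evolves from a frame, are hypothetical, chosen only to demonstrate the ordering rule.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ElementDescriptor:
    name: str
    shape: str
    evolves_from: Optional[str] = None  # element it must be drawn inside, if any

def complexity(desc, by_name):
    # 1 if the element does not evolve; otherwise
    # 1 + the complexity of the "ancestor" element.
    if desc.evolves_from is None:
        return 1
    return 1 + complexity(by_name[desc.evolves_from], by_name)

descriptors = [
    ElementDescriptor("Label", "HorizontalLine"),
    ElementDescriptor("Frame", "Rectangle"),
    # Hypothetical evolution rule for illustration only:
    ElementDescriptor("Textbox", "HorizontalLine", evolves_from="Frame"),
]
by_name = {d.name: d for d in descriptors}

# Most complex first, so a horizontal line is tried as a Textbox
# before falling back to a Label.
ordered = sorted(descriptors, key=lambda d: complexity(d, by_name), reverse=True)
```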

If no association with a Shape is made, or no ElementDescriptor matches the configuration, the Segment remains as it was drawn (an unrecognized element is called a Scribble). This way, the user can create and manipulate any kind of new element, even one not turned into a widget, making the software more flexible.

4.5 Element properties

Each element, whether recognized, unrecognized, or a group, has a number of properties, which are described below.

Name: The element's name is how the designer will reference the element throughout the design process. By default, each element is created with a name in the form <type of element><id> (e.g.: Button1, Checkbox4, Radio42), assigning a unique name to each new element within the same presentation unit (elements from different presentation units may have the same name).


Label: The label is an auxiliary text that accompanies the element, and its position varies accordingly, as can be seen in Figure 20 (the names of the elements were written as their labels):

Figure 20: Labels.

Position: The position (x, y) in the presentation unit, always referring to the top-left corner of the element relative to the top-left corner of the presentation unit, disregarding the element's label.

Size: The element's width and height, disregarding the label's dimensions. Some elements are created with fixed values in either dimension, regardless of their drawing. For example, checkboxes are always created with the same width and height, while buttons are always created with the same height, helping to establish a more consistent look and feel.

Enabled / Disabled: This relates to how the element is first displayed during a simulation with the client. When the element is disabled, its representation is grayed out, so the designer can know what the initial configuration is.

Visible / Invisible: Similar to enabled/disabled, this indicates whether the element is visible at the beginning of the simulation. If the element is invisible, it is drawn in a light shade of blue in the "design view" and does not appear to the client during the simulation. If the element is both disabled and invisible, the invisible representation is the one shown.

States: States are the possible values of the elements. Each element has its own set, expressed in the following table:


Table 2: Elements' states.

Button: no states
Checkbox: 3 states (Checked / Unchecked / Mixed)
DropDown: custom states (a string that the user can select later)
Frame: no states
Label: no states
Radio: 2 states (Checked / Unchecked)
Spinbox: 2 states (Up / Down)
Textbox: 3 pre-defined states (Blank / Valid / Invalid) + custom states (a name, a pattern and a sample text)

In the textbox case, the states are defined in a table-like fashion. Each state has a name, which is how the state is presented to the designer; it may have a pattern, which is used in the simulation to propose the state after the user inputs text; and it may have a default text, which is the text shown in the initial state. The pattern is a regular expression, evaluated when text is entered in the textbox to determine its new state.

Initial state: The state shown to the user at the beginning of the simulation.
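The textbox state proposal can be sketched with Python's re module. The state list and the 5-digit pattern are invented examples; only the mechanism (a named state with an optional regex pattern, plus a fallback state) follows the description above.

```python
import re

# Each state: (name, regex pattern or None). Order matters: the
# first pattern that fully matches the entered text wins.
STATES = [
    ("Blank", r""),        # empty input
    ("Valid", r"\d{5}"),   # invented example pattern: a 5-digit code
    ("Invalid", None),     # fallback when no pattern matches
]

def propose_state(text):
    # Propose the textbox's new state from the text the user entered.
    for name, pattern in STATES:
        if pattern is not None and re.fullmatch(pattern, text):
            return name
    return "Invalid"
```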


5 Drawing the behavior

To define the behavior of the prototype, the designer must think about the event associated with the behavior and which element triggers it, the conditions under which the behavior will happen, and which actions will be executed. This ECA (event, conditions and actions) triple represents a single case of the element's behavior, since another set of conditions can trigger a different set of actions. This chapter explains both the ECA logic (Section 5.1) and how to define an ECA in UISKEI (Section 5.2).

5.1 ECA

To define the interface behavior, we needed a language to express how to react to elements' events (e.g. mouse clicks, text input), empowering the designer to choose what to do when the interface is in a given state. We found that ECA (Event Condition Action) languages could be an intuitive and powerful paradigm for this situation (Alferes, Banti, & Brogi, 2006). They are used in many applications, such as active databases, workflow management and network management. Also, they have a declarative syntax, making them more easily analyzed than an implementation in a programming language (Bailey, Poulovassilis, & Wood, 2002).

In UISKEI, an ECA represents a behavior case, following the idea of "when an <event> happens, if all <conditions> are satisfied, then the <actions> take place". It has a name that the user may enter to make the association with the defined behavior case clearer (the default name is "<element associated to event> - ECA <number>"), the event which will trigger it, a list of conditions and a list of actions. All the ECAs are shown in the ECAMan (ECAs Manager) sidebar, allowing the user to see them at a glance, as shown in Figure 21.


Figure 21: ECAMan interface.

As can be seen in the right pane, the ECAMan sidebar shows all ECAs in the project (lower list) and gives the details of the currently selected ECA, allowing users to edit the ECA name and see the list of conditions and actions. Events will be discussed in the following section, conditions in Section 5.1.2, and actions in Section 5.1.3. In Section 5.1.4, the concept of a "valid ECA" is introduced.

5.1.1. Events (When)

An event is the action which may trigger an ECA. Each element has its own pre-defined events. While buttons, checkboxes and radio buttons can handle the "Clicked" event, textboxes can handle the "Text Changed" event and dropdown lists can handle the "Selection Changed" event. For now, an element can only have one event, but future work may overcome this limitation.

5.1.2. Conditions (If)

Conditions are expressions that must be satisfied in order to trigger an ECA. The conditions currently supported by UISKEI are related to the status of an element and are described below:

Element conditions
o If the element is in state <state name>
o If the element is <visible / invisible>
o If the element is <enabled / disabled>

An ECA's set of conditions has an implicit "AND" operator, so all conditions must be met to activate the ECA. The "OR" operator can be achieved by defining a new ECA. Ideally, the ECAs of an element should be mutually exclusive, since if two or more of them can be activated in a given situation, only the first one in the list will be. Therefore, the order of the ECAs in the ECAMan matters for the simulation.

5.1.3. Actions (Do)

An action is an operation that will be performed if the ECA is activated, changing the simulation state. While conditions may only refer to elements, an action can refer to an element, to a presentation unit, or to a default kind of message. The operations available to each group are listed below:

Element actions

o Change element to state <state name>

o Make the element <visible / invisible>

o Toggle the element's visibility status

o Make the element <enabled / disabled>

o Toggle the element's enabled status

Presentation unit actions

o <Change to / Pop up / Modal pop up> the

presentation unit

Default message actions

o Show an <information / warning / error>

message with <text>

When an ECA event is triggered, if the set of conditions is satisfied, all of its

actions are executed.
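The firing rule described so far (conditions combined with AND, all actions of the activated ECA executed, and, as noted in Section 5.1.2, only the first satisfiable ECA of an event firing) can be sketched as follows. UISKEI itself is written in C#; this Python sketch uses invented names for illustration only.

```python
# Hedged sketch of ECA dispatch: conditions are ANDed, all actions of the
# first satisfiable ECA run, and later ECAs of the same event are skipped.
class ECA:
    def __init__(self, name, conditions, actions):
        self.name = name
        self.conditions = conditions  # list of zero-argument predicates
        self.actions = actions        # list of zero-argument callables

def dispatch(event_ecas):
    """event_ecas: the ordered list of ECAs attached to the triggered event."""
    for eca in event_ecas:
        if all(cond() for cond in eca.conditions):
            for action in eca.actions:
                action()
            return eca  # only the first satisfiable ECA fires
    return None
```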


5.1.4. Valid ECAs

When an element is removed from the project, all ECAs triggered by the element's event are also removed. However, the element may also be referenced by conditions or actions of other ECAs, and the rest of those ECAs could still be valid. The same issue may happen with removed presentation units and removed states. To avoid a greater impact, such as also removing the ECAs containing the affected conditions or actions, the concept of ECA validity was created.

When a condition or action references a removed object, it becomes invalid, invalidating the ECA that contains it. An invalid ECA remains in the ECAMan, so the user can later edit it to make it valid again, but it is ignored during simulation, to avoid undesired or unplanned behavior.
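A minimal sketch of this validity rule (Python used for illustration; the Ref/Eca structures are invented, not UISKEI's actual classes):

```python
# An ECA is invalid if any of its conditions or actions references an object
# (element, presentation unit or state) that no longer exists; invalid ECAs
# stay listed in the ECAMan but are filtered out before simulation.
from collections import namedtuple

Ref = namedtuple("Ref", "target")
Eca = namedtuple("Eca", "name conditions actions")

def is_valid(eca, live_objects):
    refs = [c.target for c in eca.conditions] + [a.target for a in eca.actions]
    return all(r in live_objects for r in refs)

def simulatable(ecas, live_objects):
    return [e for e in ecas if is_valid(e, live_objects)]
```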

5.2 Drawing ECAs

As could be seen in Figure 13, the first version of UISKEI had a form-based approach to handling ECAs, forcing users to switch interaction paradigms between the interface design mode (pen-based) and the interaction design mode (form-based). One of the new version's biggest challenges was making this step better suited to pen-based interaction.

The first idea was based on the DENIM approach of navigation between pages, connecting lines between an element and a page. This is effective when there is only one kind of behavior (when <element> is clicked, go to <page>) and two parameters (the beginning of the line defines the anchor <element> and the end, the target <page>). UISKEI, however, offers three different kinds of actions, and both conditions and actions have additional parameters to be defined, so a more complex approach was necessary.

The proposed solution was to create a specific mode of interaction, ECA Mode (as opposed to the Drawing Mode / Recognition Mode used to create the interface). While in this mode, the user must focus on an element (which will be the one that raises the ECA's event) and add an ECA to it. The manipulation of ECAs (adding, removing, duplicating, activating the next or the


previous ECA) is done by clicking buttons that appear on the canvas when an

element is focused, as shown below:

Figure 22: ECA buttons.

Conditions and actions can be added to the active ECA by drawing lines, with the start and end points determining what is being added. The first idea was simple: if the line began on the focused element (the element associated with the ECA event), it created a condition; if it ended on it, an action. However, this solution demanded too much effort from the user, since it required drawing long lines; it became much too complex when an element of a different presentation unit was involved; and it made it impossible to associate a condition or action with the focused element itself, since the start and end points would both be in the element, making the user's goal ambiguous. Another solution was proposed, following the table below:

Table 3: Condition and action creation.

Start point | End point          | Creates
Element     | Anywhere           | Element condition
Canvas      | Element            | Element action
Canvas      | Presentation unit  | Presentation unit action
Canvas      | None of the above  | Default message action

Analyzing the start and end points, we can only define which "higher level" type of operation is being added, i.e., whether it is an element condition, an element action, a presentation unit action or a default message action. The specific operation and its parameters are still undefined. For example, if an element condition is being created, with only the start and end points we do not know whether it is of the kind "is in state <state name>" or "is <visible / invisible>". So, we need to ask the user for the operation (e.g. is in state <>) and its parameters (e.g. <state name>).
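The rules of Table 3 amount to a small decision procedure. A sketch (the hit-test labels "element", "unit" and "canvas" are invented for illustration):

```python
def classify_stroke(start, end):
    """Map a stroke's start/end targets ('element', 'unit' or 'canvas') to
    the high-level operation of Table 3; the specific operation and its
    parameters are then chosen through the pie menu."""
    if start == "element":
        return "element condition"       # end point may be anywhere
    # stroke started on the empty canvas:
    if end == "element":
        return "element action"
    if end == "unit":
        return "presentation unit action"
    return "default message action"      # started and ended on empty canvas
```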


To make the process more fluent and pen-like, it was decided to use a pie menu approach, since it reduces target seek time, "lowering error rates by fixing the distance factor and increasing the target size in Fitts's Law" (Callahan, Hopkins, Weiser, & Shneiderman, 1988). The pie menu pops up once the pen stops for a given amount of time, determining the end point of the line. Without raising the pen, the user chooses the remaining parameters of the condition or action being created. Therefore, a single stroke can define a condition or an action, as can be seen in the following image:

Figure 23: Pie menu allows a single stroke to define all parameters.
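For reference, the Fitts's law term quoted above is commonly written (in its Shannon formulation) as a movement time MT that grows with the distance D to the target and shrinks with the target width W; a pie menu keeps D short and constant and widens W into a wedge-shaped slice:

```latex
MT = a + b \log_2\!\left(1 + \frac{D}{W}\right)
```

where a and b are empirically determined constants.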

The default message action is defined with only one pie menu, but an

additional dialog is necessary to define the text that will be shown.

Every element condition and action is defined through a two-level pie menu. Since an element can have a variable number of states, the choice of state was postponed to the second level of the pie menu. This decision was made in order to guarantee that the first level of the pie menu would always look the same, making it more recognizable to the user and, thus, more efficient. The visibility and enabled status, although having a fixed set of operations, have different numbers of operations depending on whether it is a condition (2 options: is visible / is invisible, is enabled / is disabled) or an action (3 options: make visible / make invisible / toggle, make enabled / make disabled / toggle). So, to keep the menu coherent and make the first level the same between conditions and actions, these options were organized in the second level. The final pie menu hierarchy when creating a condition or an action can be seen in the following figure.


Figure 24: Element condition (left) and action (right) pie menus.

The presentation unit action is also defined with only one pie menu, but the problem was that the list of presentation units did not appear on the canvas, being only a drop-down list in the lower bar of the window. A first idea was to follow the DENIM approach and have multiple zoom levels, but this would require the user to have a two-dimensional spatial awareness of where his/her presentation units were. To make it similar to the dropdown list available off the canvas, a one-dimensional filmstrip was added to the canvas, showing a thumbnail of every presentation unit and allowing the user to scroll among them. The presentation unit filmstrip can be seen in Figure 25, along with the presentation unit action pie menu.

Figure 25: Presentation unit filmstrip.


The conditions and actions were added, but how should they be represented

on the canvas? DENIM showed that multiple conditions could lead to a

combinatorial explosion, as can be seen in Figure 26 taken from (Lin, Thomsen,

& Landay, 2002). Note that there are multiple copies of a page, each one

representing a combination of its possible states. Also, the usage of the same

representation to indicate either interaction or navigation adds to the confusion

and may further hinder the design process.

Figure 26: DENIM combinatorial explosion.

The first decision to overcome this problem was to show only one ECA at a time, avoiding screen pollution at the cost of losing sight of the "big picture".

The initial idea was to display the same lines used in the creation of conditions and actions, but this idea was abandoned. First, because it would also pollute the screen, as there could be many lines connecting different presentation units or distant elements. Second, because a single element could be associated with different conditions and actions, so it was necessary to distinguish between them.

Instead of lines connecting objects, the proposed solution uses a mind-map-like representation. Reflecting the "if / when / do" of an ECA, the "mind-map" shows a block of connections on the left to represent the conditions and a block of connections on the right for the actions. In the middle sits the "when" block, displaying the active ECA and its associated event. By doing so, it is possible to see all the ECA's conditions and actions in an ordered and centered manner, making it easier to view, comprehend and edit.


Originally, these blocks appeared over the focused element, but this occluded the elements behind it, making it difficult or even impossible to associate conditions or actions with the occluded elements. So, a special area is now reserved for the active ECA at the top of the canvas, keeping it always visible even when the canvas is scrolled.

The approach of drawing lines to define conditions and actions raised an issue: while in ECA mode, the user should be able both to manipulate (move and resize) the focused element and to make it part of a condition or an action. If the user clicks inside the selected element and drags it, the element should move. Since the language to create an element condition is to "draw a line from the element to the canvas" and this gesture results in moving the element, the solution was to use the "when" block as a hotspot for the focused element. So, if the user clicks inside the "when" block and drags to its outside, a condition related to the selected element is created, thus solving this impasse.

Finally, to give users feedback about which objects are related to the active ECA, they are painted with a color code specifying whether they are related to a condition (green), an action (orange) or both (brown), as shown in Figure 27. The command

button is in green (indicating it is part of a condition), the radio button and the

target presentation unit are in orange (indicating they are part of an action), and

the checkbox is in brown (indicating that it is both part of a condition and part of

an action).

Figure 27: ECA mind-map-like representation.


6 Prototype evaluation

The steps described so far in Chapters 4 (Drawing the interface) and 5

(Drawing the behavior) relate to the designer’s perspective, the one who creates

the prototype. However, the sketched mockup should also be evaluated according

to the end user’s (client’s) perspective.

In order to do so, UISKEI provides a simulation mode, which presents a

functional version of the prototype that the user can interact with, in a similar

fashion to the final product. The prototype keeps its sketchy look-and-feel, since

studies show that testing with rough prototypes does not interfere with the

discovery of usability problems when compared to more finished ones (Lin,

Thomsen, & Landay, 2002). This early stage prototyping can help designers

obtain invaluable feedback, with potential value in the later stages as well

(Hundhausen, Balkar, Nuur, & Trent, 2007).

A login screen example can be seen in Figure 28. When the user hovers the

pointer over an element, it becomes blue, to give feedback about where the pointer

is. In the example, it is possible to see that when the user clicks on a textbox, a

text input field appears, allowing the user to enter any text. The TextEntered

event is only triggered when the element loses focus, so the pattern matching only

occurs after all the text is written and considered finished. At this time, there is no special treatment for the textbox input. For instance, there is no flag indicating whether the textbox is a "password textbox". In the example, the text entered in the password field was actually a sequence of asterisks.

Then, when the user clicks on the "Groups" dropdown, the dropdown items appear. Again, the SelectedItem event only happens after the user makes a selection. Finally, when the user clicks on the "Login" button, it triggers the Clicked event, which is associated with an action displaying a default message.


Figure 28: Prototype evaluation example.

Each simulation has its own manager, SimulationManager, so it is possible to run more than one independent simulation at the same time. The SimulationManager references the current project's ECAMan, so if changes are made to ECAs in the designer view during the simulation, they are reflected in the simulation. This was planned as a basis for a future Wizard of Oz approach, which will be discussed later in Section 9.4. Besides the ECAMan, it could also reference the current presentation unit, so changes in the elements' list could be reflected on-the-fly.
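The design choice hinges on the SimulationManager holding a reference to the ECAMan rather than a copy. A minimal Python illustration of that aliasing (the classes are stand-ins for UISKEI's C# ones):

```python
# Because the manager aliases the project's ECAMan instead of snapshotting it,
# ECAs edited in the designer view mid-simulation take effect immediately.
class ECAMan:
    def __init__(self):
        self.ecas = []

class SimulationManager:
    def __init__(self, ecaman):
        self.ecaman = ecaman  # shared reference, not a snapshot

project = ECAMan()
sim = SimulationManager(project)
# an ECA edited in the designer view while the simulation runs:
project.ecas.append("newly added ECA")
```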

Despite being a functional prototype, UISKEI lacks some basic features, such as switching the focused widget with the TAB key. At the present time, only the events discussed in Section 5.1.1 are handled. Similar to the


element descriptors, the simulation representation is hard-coded according to the event type. If new widgets that share already existing events were added (for example, a list should also have the SelectedItem event), the simulation dynamics and representation should be defined in the descriptor as well.

Another reason for delegating this to the descriptors is that a widget can

handle more than one event, depending on how or where it was activated. For

example, a numeric spinbox can have the TextEntered event, when the user

inputs the value through the keyboard in the text input area, and the

ValueChanged event, when the user changes the value by using the up/down

arrows.
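A sketch of what such an event table inside a widget descriptor could look like (the format is invented for illustration; the text only proposes moving this information into the descriptors):

```python
# Hypothetical descriptor fragment: one widget, several events, each mapped
# to the interaction that raises it during simulation.
SPINBOX_DESCRIPTOR = {
    "widget": "numeric spinbox",
    "events": {
        "TextEntered": "typed value committed when the input area loses focus",
        "ValueChanged": "value changed through the up/down arrows",
    },
}

def handled_events(descriptor):
    return sorted(descriptor["events"])
```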

As could be observed, the simulation does not follow the pen-based interaction paradigm, falling instead into a mouse-based paradigm. Although this may be a valid critique, UISKEI's main intent is to explore the pen during the design phase, not during the prototype evaluation. So, the paradigm break is acceptable, particularly because it happens together with the designer/end user perspective change.


7 Recognizer test

To evaluate the recognizer and try to find the combination of parameters that results in the best recognition experience, we ran a few tests. A dataset composed of 1357 shapes was created, containing all currently recognizable shapes and organized in "test cases": 335 radios (circles), 76 horizontal lines, 171 vertical lines, 81 horizontal zig-zags, 139 vertical zig-zags, 144 triangles, and 411 rectangles (distributed in three test cases with various sizes: 207 checkboxes, 160 buttons, and 44 frames). The dataset was then tested with different configurations of the recognizer, varying the parameters described below:

Douglas-Peucker simplification algorithm usage: the recognizer may or may not use this simplification algorithm.

Douglas-Peucker tolerance (DP Tol.): if the recognizer uses the Douglas-Peucker algorithm, the tolerance used in the algorithm can vary. We tested values ranging from 1 to 14.

Direction tolerance (Dir. Tol.): when the line segment is converted into a string, only the directions determined by points farther apart than a certain distance are converted into a direction character. We tested this threshold varying it from 0 to 12.

Levenshtein cost: the Levenshtein edit distance algorithm calculates the edit distance between string A and string B by performing insertion, deletion and substitution operations. Each operation has its own cost, and the best result is given by the sequence of operations that adds up to the lowest cost. The algorithm fills a dynamic programming table where each cell (i,j) contains the best solution for the distance between the substrings up to A[i] and B[j]. The last cell (where i is the length of string A and j is the length of string B) then gives the best solution for the whole string. Since the algorithm is based on filling up a table, the operations can be interpreted as the directions we travel in the table: the insertion operation is when we


go up, the delete operation when we go left, and the substitution operation when we go diagonally. The operations' costs can be determined in different ways and we tested the following methods:

o Costs equal to the diagonal difference: all the costs are the same and equal to the difference between characters A[i] and B[j]:

diagCost = Math.Abs(A[aInx] - B[bInx]);
leftCost = Math.Abs(A[aInx] - B[bInx]);
upCost = Math.Abs(A[aInx] - B[bInx]);

o Diagonal cost 1 or 0, left and up 1: the cost of going diagonally depends on whether the characters are equal, while the cost of going left or up is always 1:

diagCost = A[aInx] == B[bInx] ? 0 : 1;
leftCost = 1;
upCost = 1;

o Cost based on each direction and base case 0: in this case we compare against characters in the same string: if we go diagonally, we compare between the strings; if we go left, we compare the current character in string A to the previous one in the same string; if we go up, we compare the current character in string B with the previous one in string B. If there is no previous character, the cost is 0:

diagCost = Math.Abs(A[aInx] - B[bInx]);
leftCost = aInx > 0 ? Math.Abs(A[aInx] - A[aInx-1]) : 0;
upCost = bInx > 0 ? Math.Abs(B[bInx] - B[bInx-1]) : 0;

o Cost based on each direction and base case 1: this case is similar to the previous one, but if there is no previous character, the cost is 1:

diagCost = Math.Abs(A[aInx] - B[bInx]);
leftCost = aInx > 0 ? Math.Abs(A[aInx] - A[aInx-1]) : 1;
upCost = bInx > 0 ? Math.Abs(B[bInx] - B[bInx-1]) : 1;
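For reference, the Douglas-Peucker simplification used as the first parameter above keeps a stroke's endpoints, recursively retains the point farthest from the current chord whenever it exceeds the tolerance, and drops the rest. A standard sketch of the algorithm (not UISKEI's C# code):

```python
import math

def _perp_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    if length == 0.0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * (px - ax) - dx * (py - ay)) / length

def douglas_peucker(points, tol):
    """Simplify a polyline, keeping points farther than tol from the chord."""
    if len(points) < 3:
        return list(points)
    index, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            index, dmax = i, d
    if dmax <= tol:
        return [points[0], points[-1]]
    left = douglas_peucker(points[: index + 1], tol)
    right = douglas_peucker(points[index:], tol)
    return left[:-1] + right  # avoid duplicating the split point
```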

The distance we use, as described earlier in Section 4.4, is proportional to the segment's length, i.e. it is the distance calculated by the Levenshtein algorithm divided by the segment's length. With this we aim to reduce the effect of longer segments having more noise and being more error


prone. With the different costs tested, the distance does not necessarily fall within the range [0.0, 1.0], since some costs may be greater than one. For example, when we use Math.Abs(A[aInx] - B[bInx]), the resulting value can be 0, 1, 2, 3 or 4 (considering the 8 possible directions).
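Putting these pieces together, a sketch of the comparison (a Python stand-in for the C# implementation; function names are invented). Directions are assumed to be encoded 0-7 over the eight compass directions, so the "diagonal difference" is taken here as the circular distance between direction codes, which ranges from 0 to 4 as stated above; boundary rows use a simplified cost.

```python
def dir_diff(a, b):
    """Circular difference between two of the 8 direction codes (0..4)."""
    d = abs(a - b) % 8
    return min(d, 8 - d)

def cost(A, B, i, j):
    # "costs equal to the diagonal difference" variant
    return dir_diff(A[i], B[j])

def levenshtein(A, B):
    """Edit distance over direction strings (simplified boundary handling;
    assumes non-empty strings)."""
    m, n = len(A), len(B)
    table = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        table[i][0] = table[i - 1][0] + cost(A, B, i - 1, 0)
    for j in range(1, n + 1):
        table[0][j] = table[0][j - 1] + cost(A, B, 0, j - 1)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            table[i][j] = min(
                table[i - 1][j - 1] + cost(A, B, i - 1, j - 1),  # substitution
                table[i - 1][j] + cost(A, B, i - 1, j - 1),      # deletion
                table[i][j - 1] + cost(A, B, i - 1, j - 1),      # insertion
            )
    return table[m][n]

def normalized_distance(segment_dirs, shape_dirs):
    # divide by the segment's length to penalize long, noisy segments less
    return levenshtein(segment_dirs, shape_dirs) / len(segment_dirs)
```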

This test focused on associating each drawing of the dataset to a shape, i.e., the shape with the smallest distance from the drawing. For each recognizer configuration, we obtained the number of successful associations and the percentage of the total it represents. Another statistic obtained is the average shape success percentage and its standard deviation, calculated by summing up each test case's success percentage and dividing by the number of test cases. This value indicates how well the configuration performed for the different shapes and sizes (the test cases) and whether it was uniform between test cases, since a recognizer configuration can perform well with checkbox rectangles but poorly with frame rectangles, for example.

We also calculated an average shape success distance, by summing up each test case's success distance and dividing by the number of test cases. This value cannot be compared between different configurations, since, depending on how the Levenshtein costs are calculated, the distance may be greater than when compared to other costs. This value is only indicative of which threshold we should use to consider that the association to a shape was in fact a successful recognition.
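As a check on these definitions, feeding the per-test-case success percentages of Table 6 into a mean and population-standard-deviation computation reproduces the 93.29% average shape success and 0.051 deviation (expressed as a fraction) reported for that configuration. A sketch, assuming the deviation is the population standard deviation (Python for illustration):

```python
import statistics

def shape_success_summary(case_success_pct):
    """case_success_pct: success percentage of each of the nine test cases."""
    return statistics.mean(case_success_pct), statistics.pstdev(case_success_pct)

# per-test-case Success % values from Table 6:
table6 = [83.58, 96.05, 98.77, 95.17, 91.88, 95.45, 92.36, 100.00, 86.33]
mean, sd = shape_success_summary(table6)
```

Here round(mean, 2) gives 93.29 and round(sd / 100, 3) gives 0.051, matching the corresponding row of Table 4.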

We tested 780 different recognizer configurations in total. We will only present the top 10 results, first ordered by total success percentage (Table 4) and then by average shape success percentage (Table 5).

Table 4: Top 10 recognizer configuration results organized by success percentage.

Dir. Tol. | Levenshtein Costs    | DP Tol. | Success # | Success % | Average Shape Success % | Std Dev Shape Success % | Ave. Shape Dist.
9         | equal to diag. diff. | 1       | 1243      | 91.60%    | 93.29%                  | 0.051                   | 25.83
8         | equal to diag. diff. | 1       | 1240      | 91.38%    | 91.42%                  | 0.061                   | 27.11
9         | equal to diag. diff. | 2       | 1234      | 90.94%    | 93.55%                  | 0.063                   | 27.11
10        | equal to diag. diff. | 1       | 1231      | 90.71%    | 93.74%                  | 0.079                   | 23.89
9         | equal to diag. diff. | 3       | 1226      | 90.35%    | 93.14%                  | 0.070                   | 26.93
9         | equal to diag. diff. | 4       | 1225      | 90.27%    | 93.11%                  | 0.072                   | 27.27
8         | equal to diag. diff. | 2       | 1222      | 90.05%    | 91.25%                  | 0.071                   | 27.83
10        | equal to diag. diff. | 2       | 1220      | 89.90%    | 93.96%                  | 0.096                   | 25.08
8         | equal to diag. diff. | 3       | 1219      | 89.83%    | 91.06%                  | 0.073                   | 27.97
8         | equal to diag. diff. | 4       | 1219      | 89.83%    | 91.09%                  | 0.074                   | 28.22


Table 5: Top 10 recognizer configuration results organized by average shape success percentage.

Dir. Tol. | Levenshtein Costs    | DP Tol. | Success # | Success % | Average Shape Success % | Std Dev Shape Success % | Ave. Shape Dist.
10        | equal to diag. diff. | 2       | 1220      | 89.90%    | 93.96%                  | 0.096                   | 25.08
11        | equal to diag. diff. | 1       | 1214      | 89.46%    | 93.88%                  | 0.110                   | 22.61
10        | equal to diag. diff. | 1       | 1231      | 90.71%    | 93.74%                  | 0.079                   | 23.89
9         | equal to diag. diff. | 2       | 1234      | 90.94%    | 93.55%                  | 0.063                   | 27.11
10        | equal to diag. diff. | 3       | 1218      | 89.76%    | 93.47%                  | 0.096                   | 25.53
11        | equal to diag. diff. | 3       | 1182      | 87.10%    | 93.35%                  | 0.143                   | 22.63
11        | equal to diag. diff. | 2       | 1185      | 87.32%    | 93.33%                  | 0.142                   | 22.15
9         | equal to diag. diff. | 1       | 1243      | 91.60%    | 93.29%                  | 0.051                   | 25.83
9         | equal to diag. diff. | 3       | 1226      | 90.35%    | 93.14%                  | 0.070                   | 26.93
9         | equal to diag. diff. | 4       | 1225      | 90.27%    | 93.11%                  | 0.072                   | 27.27

The first observable conclusion is that the Levenshtein cost scheme with the best results was the first one described, with the cost equal to the diagonal difference. Another observation is that, although the use of the Douglas-Peucker algorithm provided better results, the tolerance used is small. This conclusion needs further investigation, since the recognizer with the same configuration as the best result in Table 4, except for the use of Douglas-Peucker, obtained an 83.27% success percentage.

A close-up investigation of the two best configurations can be seen in the following tables. In them, each column represents a test case. The first block of shaded rows is a summary of the results: the first row shows the number of shapes, the second row shows the number of shapes associated correctly, and the following rows show the success percentage, the successful associations' average distance and the standard deviation of such distances. The next blocks of rows provide a detailed investigation of the associations made: each block represents a shape (displayed in the first column) and for each shape we show the number of associations made, the average distance and the standard deviation. The cells in boldface highlight the successful associations.

For example, if we take Table 6 and analyze the Radio test case (by observing the column marked "Radio"), we can see that there were 335 shapes in the dataset that were drawn as radio buttons (i.e. drawn as small circles). 280 of these were successfully associated with the shape "Circle", which represents a success rate of 83.58%. Amongst the successes, the average distance was 0.40 with a standard deviation of 0.11. Looking closely at the results, we can see that the 55 circles that were not associated correctly (335 shapes - 280 successes) were mostly recognized as rectangles. Actually, 49 circles were incorrectly associated with rectangles, with an average distance of 0.36 and a 0.10 standard deviation. 5 other circles were considered triangles and 1 circle was considered a horizontal zig-zag.

Table 6: Shape analysis for best success percentage configuration.

Test case:          Radio   Label   HorizontalZZ  Checkbox  Button  Frame   Triangle  Vertical  VerticalZZ
Shapes #            335     76      81            207       160     44      144       171       139
Success #           280     73      80            197       147     42      133       171       120
Success %           83.58%  96.05%  98.77%        95.17%    91.88%  95.45%  92.36%    100.00%   86.33%
Success Ave. Dist.  0.40    0.12    0.09          0.08      0.17    0.08    0.15      0.00      0.35
Success Std. Dev.   0.11    0.24    0.16          0.13      0.19    0.10    0.22      0.00      0.20

Associations by recognized shape (each block: #, Ave. Dist., Std. Dev.):

Circle
  #           280     0       0       6       8       2       0       0       0
  Ave. Dist.  0.40    0.00    0.00    0.37    0.33    0.23    0.00    0.00    0.00
  Std. Dev.   0.11    0.00    0.00    0.16    0.14    0.11    0.00    0.00    0.00

Horizontal
  #           0       73      0       1       0       0       0       0       0
  Ave. Dist.  0.00    0.12    0.00    0.71    0.00    0.00    0.00    0.00    0.00
  Std. Dev.   0.00    0.24    0.00    0.00    0.00    0.00    0.00    0.00    0.00

Horizontal Zig-Zag
  #           1       0       80      1       0       0       0       0       0
  Ave. Dist.  0.71    0.00    0.09    0.57    0.00    0.00    0.00    0.00    0.00
  Std. Dev.   0.00    0.00    0.16    0.00    0.00    0.00    0.00    0.00    0.00

Rectangle
  #           49      0       0       197     147     42      11      0       6
  Ave. Dist.  0.36    0.00    0.00    0.08    0.17    0.08    0.29    0.00    0.77
  Std. Dev.   0.10    0.00    0.00    0.13    0.19    0.10    0.07    0.00    0.08

Triangle
  #           5       3       1       2       5       0       133     0       2
  Ave. Dist.  0.58    0.47    0.60    0.25    0.60    0.00    0.15    0.00    0.79
  Std. Dev.   0.07    0.05    0.00    0.08    0.12    0.00    0.22    0.00    0.04

Vertical
  #           0       0       0       0       0       0       0       171     11
  Ave. Dist.  0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.77
  Std. Dev.   0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.02

Vertical Zig-Zag
  #           0       0       0       0       0       0       0       0       120
  Ave. Dist.  0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.35
  Std. Dev.   0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.20


Table 7: Shape analysis for best average success percentage configuration.

Comparing the overall results, it is possible to see that the results shown in Table 7 were better than those shown in Table 6, except for the "Radio" case. This shows that the circle, a shape that has no vertices, demands a more thoughtful approach. Due to this large difference, we chose to keep the configuration with the best success percentage (the one from Table 6).

Further investigation of the different configurations may point to a better

solution than the one chosen. A larger dataset may be needed and the results

obtained should be compared to other approaches, such as those that will be

described in Section 9.2.

Data for Table 7:

Test case:          Radio   Label   HorizontalZZ  Checkbox  Button  Frame   Triangle  Vertical  VerticalZZ
Shapes #            335     76      81            207       160     44      144       171       139
Success #           225     75      80            201       154     42      138       171       134
Success %           67.16%  98.68%  98.77%        97.10%    96.25%  95.45%  95.83%    100.00%   96.40%
Success Ave. Dist.  0.47    0.10    0.08          0.07      0.16    0.08    0.14      0.00      0.34
Success Std. Dev.   0.11    0.23    0.17          0.12      0.22    0.11    0.22      0.00      0.20

Associations by recognized shape (each block: #, Ave. Dist., Std. Dev.):

Circle
  #           225     0       0       4       3       1       0       0       0
  Ave. Dist.  0.47    0.00    0.00    0.40    0.44    0.15    0.00    0.00    0.00
  Std. Dev.   0.11    0.00    0.00    0.19    0.16    0.00    0.00    0.00    0.00

Horizontal
  #           0       75      0       0       0       0       0       0       0
  Ave. Dist.  0.00    0.10    0.00    0.00    0.00    0.00    0.00    0.00    0.00
  Std. Dev.   0.00    0.23    0.00    0.00    0.00    0.00    0.00    0.00    0.00

Horizontal Zig-Zag
  #           5       1       80      0       0       0       0       0       0
  Ave. Dist.  0.63    0.50    0.08    0.00    0.00    0.00    0.00    0.00    0.00
  Std. Dev.   0.07    0.00    0.17    0.00    0.00    0.00    0.00    0.00    0.00

Rectangle
  #           90      0       0       201     154     42      6       0       0
  Ave. Dist.  0.39    0.00    0.00    0.07    0.16    0.08    0.31    0.00    0.00
  Std. Dev.   0.15    0.00    0.00    0.12    0.22    0.11    0.07    0.00    0.00

Triangle
  #           15      0       1       2       3       1       138     0       0
  Ave. Dist.  0.57    0.00    0.50    0.30    0.68    0.55    0.14    0.00    0.00
  Std. Dev.   0.10    0.00    0.00    0.10    0.05    0.00    0.22    0.00    0.00

Vertical
  #           0       0       0       0       0       0       0       171     5
  Ave. Dist.  0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.77
  Std. Dev.   0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.02

Vertical Zig-Zag
  #           0       0       0       0       0       0       0       0       134
  Ave. Dist.  0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.34
  Std. Dev.   0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.00    0.20


8 User evaluation study

A test with 12 participants was conducted to see how UISKEI compares to two other prototyping techniques: paper prototyping and prototyping using Balsamiq. The goal was to evaluate the difficulty not only of drawing a user interface, but mainly of defining its interactive behavior. The evaluator asked the participants to create and simulate a prototype of a login screen using the three different tools, evolving the prototype through three cycles of iteration:

1st cycle: Create the login screen prototype with a single checkbox, which may lead to two different outcomes during simulation.

2nd cycle: Add another checkbox to the previous prototype, increasing the number of possible outcomes to four.

3rd cycle: Discuss how much effort is needed to add yet another checkbox, raising the number of possible outcomes to eight.

Figure 29: The evolution of login screens through the cycles.

The hypothesis of the test was that UISKEI would have a poor performance in the first cycle, since its language to add elements and ECAs is unknown to most participants, but that it would improve in later cycles, as participants learn the language and benefit from having only one mock-up with coded behavior.

We expected that Balsamiq would perform well in the beginning, due to its extensive collection of widgets and the well-known drag-and-drop paradigm, but that the need to duplicate screens to express behavior using only navigation would make it harder to use as complexity increases.


According to (Hammond T. A., 2009), "Pen and paper provide a freedom of interaction that is still preferred to a computer automated design tool, even though users want the sophistication of analysis and simulation capabilities of a computer-understood diagram". We therefore expected the same pattern for paper: adding an element is extremely easy by drawing, but as the prototype evolves and becomes more complex, some changes may require the participant to redraw the prototype, eventually making it very difficult to perform the simulation on-the-fly.

In the following section we present the evaluation method used. Section 8.2

presents and analyzes some results, while Section 8.3 shows some participants'

opinions expressed during the evaluation.

8.1 Evaluation method

The experiment followed a within-group design, comparing the performances of the same participants with all three tools (paper, Balsamiq and UISKEI), thus requiring a smaller sample than if each participant were exposed to a single tool. This had the negative effect, however, of participants learning from the experience with previous tools and getting better at completing the tasks (Lazar, Feng, & Hochheiser, 2010, p. 48). To mitigate this learning effect, we randomized the order in which each participant used the tools, so that one participant's learning effect is offset by another's. Consequently, the data set as a whole is not significantly biased by the learning effect (Lazar, Feng, & Hochheiser, 2010, p. 52).
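The counterbalancing described above can be implemented in several ways; the sketch below (Python; the balanced-cycling scheme is our assumption, since the dissertation only states that the order was randomized) cycles through all 3! = 6 tool orders so that, with 12 participants, each order is used exactly twice:

```python
import itertools
import random

TOOLS = ["paper", "Balsamiq", "UISKEI"]

def assign_orders(participants, seed=0):
    """Assign each participant one of the 3! = 6 possible tool orders.

    Cycling through all permutations before repeating keeps the orders
    balanced across participants, so per-participant learning effects
    tend to cancel out in the aggregate.
    """
    orders = list(itertools.permutations(TOOLS))
    random.Random(seed).shuffle(orders)  # randomize which order goes to whom
    return {p: orders[i % len(orders)] for i, p in enumerate(participants)}

schedule = assign_orders([f"p{i}" for i in range(1, 13)])
```

With 12 participants and 6 orders, each order is assigned to exactly two participants.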

Before using each tool in the first cycle, videos were shown to introduce the tools and to explain how to add elements and define the behavior. A "cheat sheet" with the main language used in UISKEI (containing Figure 18 and Table 3) was also provided. After using each tool in cycles 1 and 2, as well as after the discussion of the 3rd cycle, the participant was asked to answer a short questionnaire containing 10 grading questions, as follows:

1. How easy was it to understand what needed to be done to add the

interface elements?

(1: very hard, 5: very easy)


2. Once you knew what to do, the effort needed to create elements was:

(1: very high, 5: very low)

3. How different was the resulting interface from what you had in mind?

(1: very different, 5: very similar)

4. In general, how did you like the way to create elements?

(1: hated it, 5: loved it)

5. How easy was it to plan what needed to be done to create the

required behavior?

(1: very hard, 5: very easy)

6. How efficient was the definition of the planned behavior?

(1: very inefficient, 5: very efficient)

7. How easy was it to create the new behavior?

(1: very hard, 5: very easy)

8. The effort required to define the new behavior was:

(1: very high, 5: very low)

9. In general, how did you like the way to define behaviors?

(1: hated it, 5: loved it)

10. Once a behavior is defined, what do you think about its

representation?

(1: hard to understand, 5: easy to understand)

All questions were formulated in a way that higher scores meant better

results. The first four questions are related to the creation of the prototype

interface, whilst the remaining questions are related to its behavior.

After completing the tasks, participants also went through a quick interview, in which they were asked which tool they would use in the situations described below, and why:

1. In an early development stage, while exploring the idea space, where different solutions are considered and constantly changed, focusing only on the interface.

2. When the idea is clearer, a solution has been chosen, and a single prototype needs to be built, still focusing only on the interface.

3. Considering that they needed to define the behavior of the chosen

solution.


The complete test script can be found in "Appendix B: Evaluation study script".

8.2 Evaluation results

Overall, the test confirmed the hypothesis that the scores given to UISKEI would increase as the cycles progressed. After the end of the second cycle, UISKEI received lower scores than the other tools only in questions 5 and 7, showing that the logic behind defining ECAs is not easily grasped. By the end of the third cycle, UISKEI's average scores were greater than the other tools' in all questions, and all were greater than 4.0. The average score of each tool in each question can be seen in Figure 30. The complete test results can be seen in "Appendix B: Complete test results".

Figure 30: Average scores per question.

The lowest scores in the third cycle were given in questions 7 and 8 (both

with an average score of 4.17). Compared to the other tools, the average of

question 7 (paper had a 3.83 average while Balsamiq had a 3.58) suggests that

participants faced difficulties in handling new behaviors with all the tools, but the

answers to question 8 (in which paper got a 3.33 score and Balsamiq, 2.75) show


that participants considered UISKEI to require less effort to solve these problems.

In the same cycle, the questions with the biggest differences from other

tools were questions 6 (+1.58 from paper and +2.33 from Balsamiq), 9 (+2.25

from paper and +2.75 from Balsamiq) and 10 (+1.92 from paper and +2.17 from

Balsamiq), all related to the interaction definition. In general, participants liked

having a single interface (contrary to the multiple ones created in Balsamiq) and a

previously defined behavior (opposed to the ―on-the-fly‖ simulation of paper).

Question 10 results also show that the mind-map representation of ECAs was well

accepted.

Another good indicator of UISKEI's success was the answers to the "hated it / loved it" questions. Question 4 is related to the user interface and shows that the added complexity is quickly perceived in the paper technique, whose scores decrease almost steadily across the cycles, while Balsamiq's decrease only in the last cycle. UISKEI, on the other hand, shows a steady increase in its scores, indicating that the language for adding elements, once learned, is well appreciated by users. The "hated it / loved it" interaction question (question 9) showed yet another pattern, with a steep decrease in the last cycle for both paper and Balsamiq, while UISKEI received an almost constant score, showing that users liked the way the increased simulation complexity was handled.

Analyzing the questions in groups (the interface-building questions, 1 to 4; the interaction-building ones, 5 to 10; and all questions grouped together), it is possible to see that UISKEI achieved good results. Moreover, Figure 31 shows that the standard deviation of UISKEI's answers (the error bars in the bar graph) was smaller than for the other tools, suggesting that the participants were more in agreement when evaluating UISKEI.


Figure 31: Average score per group of questions.

The interview results were also in favor of UISKEI, as can be seen in Figure 32. In the first question, while 25% of the participants chose paper, 33% chose UISKEI. This near tie indicates that UISKEI's sketching method is comparable to paper, giving participants the desired "paperless prototyping" feeling. Balsamiq's results in the first two questions may stem from its vast library of elements and features, such as alignment options and gridlines. However, the power of ECAs shows in the answers to the third question, in which the vast majority chose UISKEI over the other tools.

Figure 32: Summary of interview results.


8.3 Participants’ opinions

Most participants did not know that it is possible to create an interactive prototype with paper. After seeing the introductory video, some participants already complained about the work involved. When they assumed the "computer" role during the prototype simulation, the number of complaints increased. One of the participants (p6) stated that "using paper may cause confusion in the 'computer'". This opinion was later reinforced by another participant (p8), who called him/herself a "486" (referencing Intel's older line of microprocessors) during the paper simulation, summing up that "Paper is fun, but not much practical".

The overall opinion about drawing on paper was that it is good for a rough sketch, but difficult to change. It can be summarized in p1's declaration: "it really complicates in the sense that you don't have too much flexibility once you have already drawn something. I think that you become too restricted to what you have or you start from scratch, which is certainly not efficient". This shows a great disadvantage of paper, since the exploration of different solutions may be limited by the effort of making changes to the prototype.

Regarding the paper simulation, some terms used were "boring", "disgusting" and "hell". The fourth participant pointed out something very interesting, saying that "despite being easy to do and easy to simulate, you don't have the register of what was happening, unless you film it, of course, but even so, you don't have the register of the used logic or even the errors that happened."

Balsamiq, on the other hand, divided opinions, ranging from people who loved it to others who hated it. Amongst the "lovers", the most praised features were the smart gridlines and the overall look-and-feel of the elements, showing that once these "aesthetic" facilities are available, they become a major concern. Indeed, comparing the Balsamiq prototypes with the others, the layout of the elements was much more similar to the ones pictured in the script. This focus on the detailed look-and-feel was not observed in UISKEI, since, as a sketching tool, its focus should be more on the overall interaction and structure than on "aesthetics" (Landay & Myers, 2001).

The "haters" focused their dislikes on the limitation of actions to navigation, having to duplicate the prototype due to the lack of conditionals. The participant


p11 said that "it is a hell having to replicate (…) if there isn't an 'if', nothing works". In comparison with UISKEI, Balsamiq was considered "less dynamic", as summed up by p5: "Balsamiq is static, UISKEI is dynamic".

Regarding UISKEI, the most common comments were about the "learning curve" and the terminology used. This visual programming problem was already noted by (Schmucker, 1996): "the tools for producing these applications often require months or even years of study to use effectively, and more often than not require the use of programming languages that are difficult to use even for professional programmers (e.g., C++)".

However, the participants envisioned that, once learned, it would make the design process easier. P11 said that "in UISKEI, the learning curve is a little high, a little higher than normal, but once you get it, it is a piece of cake". Another participant, p4, added: "it is something that you can take a while to learn, but once you learn it, it will be way faster to use".

The behavior definition process fits the "learning curve" observation, since most users are familiar with neither pen-based interaction nor pie menus. Despite the difficulties, it was well accepted, as described by participant p8: "I found UISKEI quite cool. I liked this way of connecting events, of creating conditions and actions, way practical and well integrated to the pen, easy to use even without a mouse and even with the pen not being so precise". This participant was not the only one to comment on pen interaction issues: p1 missed the keyboard, since handwriting recognition was not available in Brazilian Portuguese (the language of the test). One of the participants chose to experiment with handwriting recognition, translating the text to English, but found that entering text this way is slower than with a keyboard, a result also obtained in (Frye & Franke, 2008) regarding writing code.

The ECA representation was also praised. Compared to Balsamiq and its replicated interfaces, p4 said that "UISKEI's way is more interesting, because you can see all the conditions that are happening at the same time". Another participant, p2, praised the way the ECAs were presented, saying that "what I liked is that it is very compact, you can work in an organized way (…) The interface is compact, things are shown in the right place and the actions are simple".


9 Conclusion and Future work

UISKEI is a promising "paperless" user interface prototyping application. It explores the pen input paradigm, allowing the user to draw the interface elements and define their behaviors. Besides the drawing component, the behavior definition also features the power of conditionals and of actions that go beyond the navigational scope. This allows the designer to easily create an interactive prototype that he/she can later evaluate with an end user, gathering invaluable feedback.

The interface building step relies on drawing recognition and the evolution of elements. The recognition uses the Douglas-Peucker simplification algorithm and the Levenshtein string edit distance, which obtained good recognition results and constitute a promising technique for customizable sketch recognition.

The interface behavior is defined in an innovative way, expressing the

behavior using an ECA language and visualizing it with a mind-map like

representation. The conditions and actions are added by drawing lines on the

canvas and defining the parameters with the aid of a pie menu, allowing the user

to define them with a single pen stroke.

The evaluation study also showed that the participants had a very positive opinion of UISKEI, giving it higher grades than the other tools. UISKEI's overall final score was 4.5, while Balsamiq's was 3.2 and paper's was 3.4. Not only was the average score more than a point higher than the other tools', but the participants were also more in agreement when grading UISKEI in the final cycle (with more complex interactions): the standard deviation of UISKEI's scores was 0.65, while for the other tools it was 1.51.
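To make the aggregation concrete, the sketch below (Python; the score lists are invented for illustration and are NOT the study's raw data) computes the per-tool mean and sample standard deviation, where a smaller deviation indicates stronger agreement among participants:

```python
from statistics import mean, stdev

# Hypothetical 1-5 scores from six raters; NOT the study's actual data.
scores = {
    "UISKEI":   [5, 4, 5, 4, 5, 4],
    "paper":    [5, 2, 4, 1, 5, 3],
    "Balsamiq": [4, 1, 5, 2, 4, 3],
}

for tool, values in scores.items():
    # stdev() is the sample standard deviation (n - 1 denominator).
    print(f"{tool:8s} mean={mean(values):.2f} stdev={stdev(values):.2f}")
```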

Despite all the good feedback, UISKEI is still a work in progress. Comparing UISKEI to the related work described in Chapter 2, we can say that, while the core functionality of the program is already available, some basic features, such as undo/redo and cut/copy/paste (only duplicate is available), are


missing. Advanced functionalities, like smart guidelines to aid in the drawing phase and smart grouping, would also be great add-ons. The complete comparison table, now including UISKEI, can be seen in Table 8, where the rows in bold highlight the features that were the focus of this dissertation.

Table 8: UISKEI comparison with related software.

(Columns, left to right: Microsoft Visio, Axure RP Pro, Denim, SketchiXML, Balsamiq, CogTool, UISKEI; y = yes, n = no, x = not applicable.)

Free: n n y y n y y
UI-prototype exclusive (vs generic diagrammatic tool): n y y y y n y
UI components for multiple environments (vs web-page prototype only): y y n y y y y
Drawing widgets: n n n y n n y
>> Evolution of widgets: x x x n x x y
Element manipulation: y n n y y y
Undo/Redo: y y y y y n
Group/Ungroup: y y n y n y
>> Select internal objects: y x x n x n
Cut/Copy/Paste: y y y y y n
>> Copies the action: n n n y n x
Zoom levels: y y y y y n
Guidelines: n n n y n n
Layer ordering: y n n y y n
Sketchy visual: n n y y y y y
Actions: n y y y y y y
>> Beyond navigation: x y n n n n y
>> Sketchy interaction: x n y y n y y
>> Event: x y y n n y y
>> Conditions: x n y n n n y
Prototype evaluation: x y y n y y y
Save: y y y n y y y

Regarding the implemented features, some features may be further

developed. A brief discussion of them will follow in the next sections, organized

by topic.

9.1 Shapes and element descriptors

At the moment, the shape descriptor only considers shapes with a single segment. Supporting shapes with multiple segments would allow a considerably larger element set.


UISKEI could also have more than one library of shapes and/or elements, so that the user can choose which ones to use and customize the element recognition. This would allow for greater software flexibility, for example, having one library for desktop application prototypes and another for mobile applications. However, if the system becomes highly customizable, a tool to check the libraries for consistency will be needed, detecting, for instance, two shapes that are recognized in the same way, or elements that can never be created because an ancestor element or required shape is missing, or because another element is always created instead under a given condition.
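Part of such a consistency checker is simple to sketch: if shape descriptors are encoded as strings, two elements clash when they map to the same descriptor. The element names and encodings below are hypothetical, chosen only to illustrate the check:

```python
from collections import defaultdict

# Hypothetical library: element name -> direction-string shape descriptor.
library = {
    "button":   "0246",   # rectangle: east, south, west, north
    "checkbox": "0246",   # also a rectangle -> clashes with "button"
    "combobox": "02461",
}

def clashes(descriptors):
    """Group element names whose shape descriptors are identical."""
    by_shape = defaultdict(list)
    for name, shape in descriptors.items():
        by_shape[shape].append(name)
    return [names for names in by_shape.values() if len(names) > 1]

print(clashes(library))  # [['button', 'checkbox']]
```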

9.2 Recognizer

The eight-direction limit can be changed to consider more directions, but the impact on the shape descriptor and on the recognition rate should be evaluated. Also, still within the string-edit-distance approach, other implementations can be tried, such as the ones described in (Schimke & Vielhauer, 2007). For example, (Coyette, Schimke, Vanderdonckt, & Vielhauer, 2007) uses a recognition method that replaces the Douglas-Peucker simplification algorithm with a uniform grid approximation approach.
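For reference, the baseline pipeline that these alternatives would replace (Douglas-Peucker simplification, encoding each simplified segment as one of eight directions, and comparing the resulting strings with the Levenshtein distance) can be sketched as follows. This is an illustrative Python reimplementation with invented function names, not UISKEI's actual C# code:

```python
import math

def rdp(points, epsilon):
    """Douglas-Peucker: drop points closer than epsilon to the chord."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy) or 1e-9
    # Perpendicular distance of each interior point to the chord.
    dists = [abs(dy * (x - x1) - dx * (y - y1)) / norm for x, y in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__)
    if dists[i] > epsilon:
        left = rdp(points[: i + 2], epsilon)   # split at the farthest point
        right = rdp(points[i + 1 :], epsilon)
        return left[:-1] + right
    return [points[0], points[-1]]

def directions(points):
    """Encode each simplified segment as one of 8 compass directions (0-7)."""
    codes = []
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        angle = math.atan2(y2 - y1, x2 - x1)
        codes.append(str(round(angle / (math.pi / 4)) % 8))
    return "".join(codes)

def levenshtein(a, b):
    """Classic string edit distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[-1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

# A wobbly, roughly horizontal stroke simplifies to one "east" segment.
stroke = [(0, 0), (10, 1), (20, -1), (30, 0)]
print(directions(rdp(stroke, epsilon=3.0)))  # "0"
```

Swapping rdp for a uniform grid approximation, as in (Coyette et al., 2007), would only change the first stage; the string comparison is untouched.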

If the recognition rate is still unsatisfactory, other approaches can be tried, such as a neural network. Since recognition tends to be ambiguous and error-prone, the application could give users feedback about the recognized shape as it is being drawn, allowing them to make the necessary changes while drawing (Tandler & Prante, 2001), or at least offer the possibility of correcting the recognition afterwards. Also, a beautification of the user's rough sketches could be implemented, such as the one proposed in (Paulson & Hammond, 2008).

9.3 ECA

Since the ECAs are evaluated in the order they appear in ECAMan, it is

necessary to give the user the option to change this order. Also, to avoid the

repetition of ECAs, a way to express AND and OR clauses in conditionals could be


looked into. Another improvement could be to trigger more than one ECA at a time, and the visualization of such a possibility should be considered.

The limitation of an element having only one possible event should be reconsidered, since some elements may have more than one. For example, a spinbox can have its value changed by the user clicking the up or down button or by typing the value. The ECAs should be able to handle multiple events, giving the option of choosing a specific event (e.g., clicked or text entered) or a more generic one (e.g., value changed).

Finally, for bigger projects, keeping track of ECAs can be difficult, so a tool to validate them, searching for unreachable ECAs (because of errors in the condition set or because another ECA is always activated first) or duplicated ones, may be necessary.
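The ordering issue is easy to demonstrate with a minimal ECA evaluator (illustrative Python with invented names, not UISKEI's C# classes): ECAs are scanned in list order and the first one whose event matches and whose conditions all hold fires, so an unconditional ECA placed first makes a later, more specific ECA for the same event unreachable:

```python
def fire_first(ecas, event, state):
    """Evaluate ECAs in list order; the first fully-matching one fires."""
    for eca in ecas:
        if eca["event"] == event and all(cond(state) for cond in eca["conditions"]):
            return eca["name"]
    return None  # no ECA matched

ecas = [
    {"name": "any-login", "event": "click", "conditions": []},
    # Unreachable: "any-login" has no conditions and is evaluated first,
    # so this ECA can never fire for the same event.
    {"name": "admin-login", "event": "click",
     "conditions": [lambda s: s["user"] == "admin"]},
]

print(fire_first(ecas, "click", {"user": "admin"}))  # any-login
```

Reversing the list makes "admin-login" fire first for an admin user, which is why exposing the evaluation order to the user matters.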

9.4 Prototype evaluation

The prototype evaluation could take advantage of the Wizard of Oz technique, with the end user running the simulation on one computer and a member of the design team following the session on another. With this, the end user can be given more freedom or be more constrained, since the designer can mediate his/her interaction (Dahlbäck, Jönsson, & Ahrenberg, 1993). Besides the usual logging possibility, this would also allow for on-the-fly definition of presentation units, elements and ECAs, which could be reused in future tests.


10 Bibliography

Alferes, J., Banti, F., & Brogi, A. (2006). An Event-Condition-Action Logic Programming Language. Proceedings of the 10th European Conference on Logics in Artificial Intelligence (JELIA '06), pp. 29-42.

Bailey, J., Poulovassilis, A., & Wood, P. T. (2002). An event-condition-action

language for XML. Proceedings of the 11th international conference on World

Wide Web (WWW '02), pp. 486-495.

Barber, G. (2009, March 25). 16 Design Tools for Prototyping and Wireframing.

Retrieved July 22, 2010, from sitepoint: http://articles.sitepoint.com/article/tools-

prototyping-wireframing

Callahan, J., Hopkins, D., Weiser, M., & Shneiderman, B. (1988). An empirical

comparison of pie vs. linear menus. Proceedings of the SIGCHI conference on

Human factors in computing systems (CHI '88), pp. 95-100.

Carnegie Mellon University. (2009). CogTool. Retrieved July 28, 2010, from

Human-Computer Interaction Institute: http://cogtool.hcii.cs.cmu.edu/

Cha, S.-H., Shin, Y.-C., & Srihari, S. N. (1999). Approximate Stroke Sequence

String Matching Algorithm for Character Recognition and Analysis. Proceedings

of the Fifth International Conference on Document Analysis and Recognition

(ICDAR '99), p. 53.

Coyette, A., Faulkner, S., Kolp, M., Limbourg, Q., & Vanderdonckt, J. (2004).

SketchiXML: towards a multi-agent design tool for sketching user interfaces

based on USIXML. Proceedings of the 3rd annual conference on Task models

and diagrams , pp. 75-82.

Coyette, A., Schimke, S., Vanderdonckt, J., & Vielhauer, C. (2007). Trainable

sketch recognizer for graphical user interface design. Proceedings of the 11th

IFIP TC 13 international conference on Human-computer interaction

(INTERACT'07), pp. 124-135.

Dahlbäck, N., Jönsson, A., & Ahrenberg, L. (1993). Wizard of Oz studies: why

and how. Proceedings of the 1st international conference on Intelligent user

interfaces (IUI '93), pp. 193-200.

Davis, R. C., Saponas, T. S., Shilman, M., & Landay, J. A. (2007). SketchWizard:

Wizard of Oz prototyping of pen-based user interfaces. Proceedings of the 20th


annual ACM symposium on User interface software and technology (UIST '07),

pp. 119-128.

Douglas, D. H., & Peucker, T. K. (1973, December). Algorithms for the reduction of the number of points required to represent a digitized line or its caricature. Cartographica: The International Journal for Geographic Information and Geovisualization, 10(2), pp. 112-122.

Frye, J., & Franke, B. (2008). PDP: Pen Driven Programming. Proceedings of the

22nd British HCI Group Annual Conference on People and Computers: Culture,

Creativity, Interaction - Volume 2, pp. 127-130.

Gross, M. D. (2009). Visual languages and visual thinking: sketch based

interaction and modeling. SBIM '09: Proceedings of the 6th Eurographics

Symposium on Sketch-Based Interfaces and Modeling (pp. 7-11). New Orleans,

Louisiana: ACM.

Gusfield, D. (1997). Algorithms on Strings, Trees and Sequences: Computer

Science and Computational Biology. Cambridge University Press.

Hammond, T. A. (2009). IUI'09 workshop summary: sketch recognition.

Proceedings of the 13th international conference on Intelligent user interfaces

(pp. 501-502). Sanibel Island, Florida, USA: ACM.

Hammond, T., & Davis, R. (2006). LADDER: a language to describe drawing,

display, and editing in sketch recognition. ACM SIGGRAPH 2006 Courses

(SIGGRAPH '06).

Hammond, T., Eoff, B., Paulson, B., Wolin, A., Dahmen, K., Johnston, J., et al.

(2008). Free-sketch recognition: putting the chi in sketching. CHI '08 extended

abstracts on Human factors in computing systems (CHI EA '08), pp. 3027-3032.

Hammond, T., Lank, E., & Adler, A. (2010). SkCHI: designing sketch recognition

interfaces. CHI EA '10: Proceedings of the 28th of the international conference

extended abstracts on Human factors in computing systems (pp. 4501-4504).

Atlanta, Georgia, USA: ACM.

Harrelson, D. (2009, March 24). Rapid Prototyping Tools. Retrieved July 22,

2010, from Adaptive Path Blog:

http://www.adaptivepath.com/blog/2009/03/24/rapid-prototyping-tools/

Hong, J., Landay, J., Long, A. C., & Mankoff, J. (2002). Sketch Recognizers from

the End-User's, the Designer's, and the Programmer's Perspective.

Hundhausen, C. D., Balkar, A., Nuur, M., & Trent, S. (2007). WOZ pro: a pen-

based low fidelity prototyping environment to support wizard of oz studies. CHI

'07 extended abstracts on Human factors in computing systems (CHI EA '07), pp.

2453-2458.


Kieffer, S., Coyette, A., & Vanderdonckt, J. (2010). User interface design by

sketching: a complexity analysis of widget representations. EICS '10: Proceedings

of the 2nd ACM SIGCHI symposium on Engineering interactive computing

systems (pp. 57-66). Berlin, Germany: ACM.

Kurtenbach, G. (2010). Pen-based computing. XRDS, 16(4), 14-20.

Landay, J. A., & Myers, B. A. (1995). Interactive sketching for the early stages of

user interface design. Proceedings of the SIGCHI conference on Human factors in

computing systems, pp. 43-50.

Landay, J. A., & Myers, B. A. (2001). Sketching Interfaces: Toward More Human

Interface Design. Computer, 34, 56-64.

Lazar, J., Feng, J. H., & Hochheiser, H. (2010). Research Methods in Human-

Computer Interaction. John Wiley & Sons Ltd.

Lin, J., Thomsen, M., & Landay, J. A. (2002). A visual language for sketching

large and complex interactive designs. Proceedings of the SIGCHI conference on

Human factors in computing systems: Changing our world, changing ourselves, pp. 307-314.

Paulson, B., & Hammond, T. (2008). PaleoSketch: accurate primitive sketch

recognition and beautification. Proceedings of the 13th international conference

on Intelligent user interfaces (IUI '08), pp. 1-10.

Po, B. A., Fisher, B. D., & Booth, K. S. (2005). Comparing cursor orientations for

mouse, pointer, and pen interaction. CHI '05: Proceedings of the SIGCHI

conference on Human factors in computing systems (pp. 291-300). Portland,

Oregon, USA: ACM.

Preece, J., Rogers, Y., & Sharp, H. (2002). Interaction design: beyond human-

computer interaction. John Wiley.

Schimke, S., & Vielhauer, C. (2007). Similarity searching for on-line handwritten

documents. Journal on Multimodal User Interfaces, 1(2), 49-54.

Schmucker, K. J. (1996). Rapid prototyping using visual programming tools.

Conference companion on Human factors in computing systems: common ground, pp. 359-360.

Segura, V. C., & Barbosa, S. D. (2008). Ferramenta de apoio ao esboço de

interfaces gráficas através da interação com caneta. Relatório de Projeto Final,

PUC-Rio, Departamento de Informática.

Segura, V. C., & Barbosa, S. D. (2009). UISK: Supporting Model-Driven and

Sketch-Driven Paperless Prototyping. Proceedings of the 13th International

Conference on Human-Computer Interaction. Part I: New Trends, pp. 697-705.


Sezgin, T. M., Stahovich, T., & Davis, R. (2006). Sketch based interfaces: early

processing for sketch understanding. ACM SIGGRAPH 2006 Courses

(SIGGRAPH '06).

Szekely, P. A. (1994). User Interface Prototyping: Tools and Techniques.

Proceedings of the Workshop on Software Engineering and Human-Computer

Interaction (ICSE '94), pp. 76-92.

Tandler, P., & Prante, T. (2001, November). Using Incremental Gesture

Recognition to Provide Immediate Feedback while Drawing Pen Gestures. 14th

Annual ACM Symposium on User Interface Software and Technology (UIST

2001).

Tohidi, M., Buxton, W., Baecker, R., & Sellen, A. (2006a). Getting the right

design and the design right. Proceedings of the SIGCHI conference on Human

Factors in computing systems (CHI '06), pp. 1243-1252.

Tohidi, M., Buxton, W., Baecker, R., & Sellen, A. (2006b). User sketches: a

quick, inexpensive, and effective way to elicit more reflective user feedback.

Proceedings of the 4th Nordic conference on Human-computer interaction:

changing roles (NordiCHI '06), pp. 105-114.


11 Appendix A: Implementing UISKEI

We developed UISKEI in C#, using Microsoft Visual Studio 2007 and the

.NET Framework 3.5. It consists of three main projects — uskModel,

uskRecognizer and uskWizard — which will be presented in the following

sections.

11.1 uskModel

This project contains all the core data and logic of a UISKEI project. All the project information that needs to be saved to a file resides in this project, so many of its classes implement the C# interface ISerializable, allowing the serialization of the necessary data to a binary file. The class diagram of this project can be seen in Figure 33.

Figure 33: Class diagram for uskModel.


The main class of uskModel is UiskeiProject, which stores a list of PresentationUnits, a ProjectDefaults and an ECAMan. A PresentationUnit stores a list of abstract elements, AElementModel. For most uses, the abstract element is sufficient: we only need to know the "concrete" element to create the correct visualization and to create the group of selected elements. All other operations are done with abstract elements.

An element can contain a list of ElementStates, each of which has a name and may have additional parameters that are interpreted accordingly. For example, the textbox states discussed in Section 4.5 have the pattern and the sample text as additional parameters.

The AElementModel is the base class for three different concrete classes:

- GroupElementModel represents a group of elements, and therefore holds a list of AElementModels.

- ScribbleElementModel represents an unidentified drawing, and therefore contains the drawing converted to a MultipleSegments, composed of a list of Segments and a bounding box.

- DescribedElementModel represents an identified drawing, which was converted to a widget. It contains a reference to the ElementDescriptor which describes the widget.
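This hierarchy can be sketched as below; the member names and the stand-in types MultipleSegments and ElementDescriptor are illustrative, not copied from the actual source.

```csharp
using System.Collections.Generic;

// Hypothetical stand-ins for types described elsewhere in this appendix.
public class MultipleSegments { }            // list of Segments plus a bounding box
public class ElementDescriptor { public string Name; }

public abstract class AElementModel { }

// A group of elements, therefore holding a list of AElementModels.
public class GroupElementModel : AElementModel
{
    public List<AElementModel> Children = new List<AElementModel>();
}

// An unidentified drawing, stored as a MultipleSegments.
public class ScribbleElementModel : AElementModel
{
    public MultipleSegments Drawing;
}

// An identified drawing, converted to a widget via its descriptor.
public class DescribedElementModel : AElementModel
{
    public ElementDescriptor Descriptor;
}
```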

As previously discussed in Section 4.3, we implemented the descriptors hard-coded, so the class diagram shows the full inheritance of classes and the available elements. Even so, the remaining code is already prepared to handle generic descriptors, since the "concrete descriptors" are private to this project and never referenced directly. The only class that uses the "concrete descriptors" is the loader, ElementDescriptorLibrary. When the descriptor language becomes available, we will only have to change the loader to create descriptor instances from the language. This descriptor language should handle all the necessary information currently coded in the ElementDescriptors, as shown in Figure 34.


Figure 34: ElementDescriptor class.

The ProjectDefaults stores some default values for the project, such as a default width and height for newly created presentation units. The ECAMan stores all the project's available ECAs in a list, organized by the element that triggers the event. An ECA stores the EventType that triggers the behavior, the list of EcaConditions to be tested, and the list of EcaActions to be performed.

Since the available ECA conditions are only related to elements, the EcaCondition class has a single constructor with two parameters: an AElementModel and a TestType. TestType is a nested enumeration of the available test operations, listed in Section 5.1.2.

On the other hand, there are three different types of ECA actions, so EcaAction was implemented as an interface. The MessageEcaAction, ElementEcaAction and ViewEcaAction classes implement this interface. All three classes have a single constructor which receives the target object (a presentation unit in the case of MessageEcaAction, an element in the case of ElementEcaAction, and a string in the case of ViewEcaAction) and an ActionType. Similar to the TestType, ActionType is a nested enumeration, in each class, of the available operations, as listed in Section 5.1.3. An overview of the TestType and ActionType can be seen in Figure 35.

Figure 35: Operations enums.
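A sketch of how these ECA pieces fit together is shown below. The enumeration values and the Execute method are assumptions for illustration only; the actual operations are the ones listed in Sections 5.1.2 and 5.1.3.

```csharp
using System.Collections.Generic;

public class AElementModel { }
public enum EventType { Click, ValueChanged }              // hypothetical values
public enum TestType { IsEnabled, IsDisabled, IsChecked }  // hypothetical subset

// A condition always refers to an element and a test operation.
public class EcaCondition
{
    public readonly AElementModel Element;
    public readonly TestType Test;

    public EcaCondition(AElementModel element, TestType test)
    {
        Element = element;
        Test = test;
    }
}

// The three action classes (MessageEcaAction, ElementEcaAction,
// ViewEcaAction) implement this interface.
public interface EcaAction
{
    void Execute();
}

// An ECA: the triggering event, the conditions to test,
// and the actions to perform.
public class Eca
{
    public EventType Trigger;
    public List<EcaCondition> Conditions = new List<EcaCondition>();
    public List<EcaAction> Actions = new List<EcaAction>();
}
```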

Also in uskModel there is the SimulationManager, which references an ECAMan. This class controls the state of the elements — their properties and ElementState — during a prototype evaluation session, creating a SimulationElementState data structure to store it. An ECA is activated through the SimulationManager, which tests its conditions and executes its actions according to the SimulationElementStates. As discussed in Section 6, since the manager only references the ECAMan, changes to ECAs are reflected on-the-fly during the prototype evaluation session.

The Common class contains a set of common methods used across the projects. The ResizePoint class represents the eight handles, or points, that appear onscreen when the user tries to resize an element. Since the resize result depends on which point is manipulated, the point is a parameter of an element's resize method. When no ResizePoint is specified, the resize is assumed to occur in the SouthEast direction.

11.2 uskRecognizer

This project is responsible for recognizing the user's drawings. It handles the MultipleSegments and Segment data structures and evaluates the Shape associated with them. The classes that compose this project can be seen in Figure 36.

Figure 36: Class diagram for uskRecognizer.

The main exported class is ShapeRecognizer. It has two methods, GetBestShape and GetElementDescriptorFromShape. The first method receives the MultipleSegments to be recognized and returns a BestShapeResult structure, which references the shape with the smallest distance and the distance value itself.

After associating the MultipleSegments with a shape, the second method is called. It receives the MultipleSegments, the BestShapeResult, and the element in which the drawing was made (or null otherwise). It returns an ElementDescriptorResult data structure, which contains the descriptor (or null if not recognized) and how the element should be created (as a new element or an evolved element).
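The two-step call sequence can be sketched as below. All types are simplified stand-ins for the ones described above, the member names are assumptions, and the method bodies return canned values purely to illustrate the data flow.

```csharp
// Hypothetical stand-ins for the types described in this section.
public class MultipleSegments { }
public class AElementModel { }
public class ElementDescriptor { public string Name; }

public class BestShapeResult
{
    public string ShapeName;  // shape with the smallest distance
    public double Distance;   // the distance value itself
}

public enum ElementCreation { NewElement, EvolvedElement }

public class ElementDescriptorResult
{
    public ElementDescriptor Descriptor;  // null when not recognized
    public ElementCreation Creation;
}

public class ShapeRecognizer
{
    // Step 1: associate the drawing with the closest known shape.
    public BestShapeResult GetBestShape(MultipleSegments drawing)
    {
        return new BestShapeResult { ShapeName = "rectangle", Distance = 0.0 };
    }

    // Step 2: map shape + context (the element the drawing was made in,
    // or null) to an element descriptor.
    public ElementDescriptorResult GetElementDescriptorFromShape(
        MultipleSegments drawing, BestShapeResult best, AElementModel over)
    {
        var name = over == null ? "button" : "textbox";
        return new ElementDescriptorResult
        {
            Descriptor = new ElementDescriptor { Name = name },
            Creation = over == null ? ElementCreation.NewElement
                                    : ElementCreation.EvolvedElement
        };
    }
}
```

Because the steps are separate, GetBestShape can run once per drawing, while GetElementDescriptorFromShape can be re-run as the surrounding context changes.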

The recognition process was divided into two methods for two reasons. First, for testing purposes, since the association with a shape is decoupled from the recognition as an element. Second, because several drawings may enter the recognition process, the association with a shape needs to happen only once, while the element recognition may happen iteratively. For example, if the user draws several rectangles in sequence and at the end draws a line in the first one, the first rectangle should be converted into a textbox and the others into buttons. The association with a shape happens only at the beginning (several rectangles and a horizontal line), but the element recognition must happen iteratively (so the first rectangle is initially recognized as a button and only later as a textbox).

The GetBestShape method uses the ShapeLibrary to go through the list of loaded shapes. A StringShape is defined as explained in Section 4.2 and is loaded from a text file with a .shp extension.
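The idea of describing a stroke as a string of directions — the basis of a StringShape — can be sketched as follows. The four-letter alphabet, the dominant-axis rule, and the sampling distance are illustrative assumptions, not the recognizer's actual implementation.

```csharp
using System;
using System.Collections.Generic;
using System.Text;

public static class DirectionString
{
    // Converts a sequence of (x, y) points into a string of direction
    // characters: R(ight), L(eft), U(p), D(own). Points closer than
    // minDistance to the last accepted point are skipped, matching the
    // "how far apart the points must be" parameter discussed below.
    public static string FromPoints(IList<(double X, double Y)> points, double minDistance)
    {
        var sb = new StringBuilder();
        var last = points[0];
        for (int i = 1; i < points.Count; i++)
        {
            double dx = points[i].X - last.X;
            double dy = points[i].Y - last.Y;
            if (Math.Sqrt(dx * dx + dy * dy) < minDistance) continue;

            // Pick the dominant axis to choose the direction character.
            if (Math.Abs(dx) >= Math.Abs(dy))
                sb.Append(dx > 0 ? 'R' : 'L');
            else
                sb.Append(dy > 0 ? 'D' : 'U');  // screen Y grows downwards

            last = points[i];
        }
        return sb.ToString();
    }
}
```

With this encoding, a rectangle drawn clockwise from the top-left corner yields a string such as "RDLU".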

The other classes are internal to the project. RecognizerCommon contains common methods used in the project. RecognizerArgs is a structure that associates a Segment with its description as a string of directions, and is used during the shape association step. Finally, the DouglasPeucker class implements the Douglas-Peucker line simplification algorithm discussed in Section 4.4.
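The core of the algorithm can be sketched as below — a generic recursive version for illustration, not the actual uskRecognizer code:

```csharp
using System;
using System.Collections.Generic;

public static class DouglasPeuckerSketch
{
    // Simplifies a polyline: an interior point survives only if its
    // perpendicular distance to the chord between the endpoints
    // exceeds the tolerance.
    public static List<(double X, double Y)> Simplify(
        List<(double X, double Y)> pts, double tolerance)
    {
        if (pts.Count < 3) return new List<(double X, double Y)>(pts);

        int index = 0;
        double maxDist = 0;
        for (int i = 1; i < pts.Count - 1; i++)
        {
            double d = PerpendicularDistance(pts[i], pts[0], pts[pts.Count - 1]);
            if (d > maxDist) { maxDist = d; index = i; }
        }

        if (maxDist <= tolerance)
            return new List<(double X, double Y)> { pts[0], pts[pts.Count - 1] };

        // Recurse on both halves, which share the farthest point.
        var left = Simplify(pts.GetRange(0, index + 1), tolerance);
        var right = Simplify(pts.GetRange(index, pts.Count - index), tolerance);
        left.RemoveAt(left.Count - 1);  // the split point appears in both halves
        left.AddRange(right);
        return left;
    }

    static double PerpendicularDistance(
        (double X, double Y) p, (double X, double Y) a, (double X, double Y) b)
    {
        double dx = b.X - a.X, dy = b.Y - a.Y;
        double len = Math.Sqrt(dx * dx + dy * dy);
        if (len == 0)
            return Math.Sqrt((p.X - a.X) * (p.X - a.X) + (p.Y - a.Y) * (p.Y - a.Y));
        return Math.Abs(dy * p.X - dx * p.Y + b.X * a.Y - b.Y * a.X) / len;
    }
}
```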

There are a number of parameters related to the recognition process that need to be defined. The Douglas-Peucker simplification algorithm may or may not be used and, if used, we need to establish a tolerance value. Then, when a segment is converted to a string, we need to define how far apart the points must be in order to be converted into a direction character. Finally, given the segment string, we need to define the costs used in the Levenshtein edit distance algorithm. An experimental test to determine these parameters is detailed in Chapter 7.
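The classic dynamic-programming edit distance, extended with configurable costs, can be sketched as below — a generic version for illustration, not the recognizer's actual implementation:

```csharp
using System;

public static class EditDistanceSketch
{
    // Levenshtein distance between two direction strings, with
    // separately configurable insertion, deletion and substitution costs.
    public static int Compute(string a, string b, int insCost, int delCost, int subCost)
    {
        var d = new int[a.Length + 1, b.Length + 1];
        for (int i = 0; i <= a.Length; i++) d[i, 0] = i * delCost;
        for (int j = 0; j <= b.Length; j++) d[0, j] = j * insCost;

        for (int i = 1; i <= a.Length; i++)
            for (int j = 1; j <= b.Length; j++)
            {
                int sub = d[i - 1, j - 1] + (a[i - 1] == b[j - 1] ? 0 : subCost);
                int del = d[i - 1, j] + delCost;
                int ins = d[i, j - 1] + insCost;
                d[i, j] = Math.Min(sub, Math.Min(del, ins));
            }
        return d[a.Length, b.Length];
    }
}
```

Tuning the three costs changes which shape a noisy drawing is considered closest to, which is why they are among the parameters evaluated experimentally in Chapter 7.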

11.3 uskWizard

The last project is uskWizard. It contains the graphical user interface of UISKEI, built entirely using XAML. The project's classes can be seen in Figure 37.


Figure 37: Class diagram for uskWizard.

The MainWindow is composed of several controls — MenuControl, PresentationUnitControl, ECAManControl, ToolbarControl, ElementControl — and the WizardView, a UserControl with a WizardCanvas. Since the diagram generated by Visual Studio lacks a representation of GUI composition, the classes are displayed in a tree-like fashion in Figure 37 to illustrate this composition.


Each control references the same WizardController, an object that keeps the current state of the interface — for example, the current project, the current presentation unit, the selected elements, etc. Since all controls reference the same object, they are kept in sync.

The WizardCanvas is where the presentation units are displayed and the pen-based interaction occurs. It contains a PresentationUnitFilmstrip (shown in Figure 25), composed of a list of thumbnails, PresentationUnitThumbnails. Each thumbnail contains a scaled PresentationUnitView, which is the representation of a presentation unit. Each PresentationUnitView contains a list of AElementViews which, similarly, are the representations of AElementModels. The ElementViewFactory is responsible for creating the correct concrete implementation of AElementView.

This architecture does not use the FrameworkElement paradigm, so we had to explicitly handle the Draw event and also the mouse events (e.g. click, enter, leave), giving us total control over how rendering and interaction work. Also, since views hold no reference to parent objects, we were able to reuse the same PresentationUnitView in both the filmstrip and the canvas, reducing the number of objects in memory.

The MainWindow also controls the current mode of UISKEI, among four possible modes: drawing mode, recognition mode, ECA mode and simulation mode. The first three modes change how the designer interacts with the canvas, while the last one opens a new dialog.

The drawing mode and the recognition mode are essentially the same, differing only in whether the drawings are recognized. The DrawingMode class is therefore used for both, and a boolean constructor parameter turns the recognition process on or off. The ECA mode is handled by the EcaMode class.

The modes that change the interaction with the canvas are abstracted by the IMode interface. When an IMode is set on the canvas, the previous one is deactivated and the new one is activated, so each mode can connect to and disconnect from the canvas events it needs. For example, for the DrawingMode the StrokeCollected event is important and MouseMove is not, while for the EcaMode it is the opposite: the MouseMove event is important and StrokeCollected is not.
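The activate/deactivate contract can be sketched as below; the Canvas stand-in, its events, and the event wiring are simplified assumptions about what the actual modes do.

```csharp
using System;

// Simplified stand-in for the canvas and its events.
public class Canvas
{
    public event Action StrokeCollected;
    public event Action MouseMove;
    public void RaiseStroke() { StrokeCollected?.Invoke(); }
    public void RaiseMove() { MouseMove?.Invoke(); }
}

public interface IMode
{
    void Activate(Canvas canvas);    // connect to the events this mode needs
    void Deactivate(Canvas canvas);  // disconnect them again
}

// Drawing mode cares about strokes, not mouse movement.
public class DrawingMode : IMode
{
    public int StrokesSeen;
    void OnStroke() { StrokesSeen++; }
    public void Activate(Canvas c) { c.StrokeCollected += OnStroke; }
    public void Deactivate(Canvas c) { c.StrokeCollected -= OnStroke; }
}

// Switching modes deactivates the previous one before activating the next.
public class ModeHost
{
    private IMode current;
    private readonly Canvas canvas;

    public ModeHost(Canvas canvas) { this.canvas = canvas; }

    public void SetMode(IMode next)
    {
        current?.Deactivate(canvas);
        next.Activate(canvas);
        current = next;
    }
}
```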

The last mode, simulation mode, displays a new window with the interactive prototype for evaluation purposes. Since one of the ideas for future work is a Wizard of Oz approach to the simulation (as will be discussed in Section 9.4), the simulation dialog was named Dorothy; it is composed of ElementViews. To investigate how the FrameworkElement paradigm works, ElementView inherits from FrameworkElement.

Other, smaller dialogs are also part of uskWizard. The DefaultTextboxDialog is a standard dialog with an instruction message and a textbox, used to input the text of a message action. The ProjectDefaultsDlg configures the uskModel.ProjectDefaults of the current project. The StatesEditorDlg is the dialog for editing an element's states. For now, it contains only a textbox, but it could be improved to show a data grid with on-the-fly validation, for example.


12 Appendix B: Evaluation study script

The test comparing the three prototyping techniques took place at the beginning of February 2011, with 12 participants. Each one was presented with the script below, freely translated from Portuguese (the participants' native language), preserving its format, organization, and structure.

12.1 Description

This study aims to compare three different GUI prototyping techniques (paper prototyping, prototyping using UISKEI and prototyping using Balsamiq), evaluating how they can be applied to a use case described throughout the test. It is divided into the stages described below:

- Initial stage: Only a questionnaire about your familiarity with the use of computers and prototyping tools.

- 1st and 2nd cycles: Each one is further divided into three parts:

  o Scenario: A story to motivate the tool usage.

  o Task: The activity to be performed, related to the scenario, using all three techniques. In the first cycle, a video will be shown to introduce you to the tools.

  o Questionnaire: Three questionnaires will be presented in each cycle, one after the use of each tool during task execution, to obtain your opinion about each technique.

- Final cycle: An additional scenario will be shown, but the task will not need to be performed: we will only talk about how you would perform it. After the discussion, the same three questionnaires should be filled in and a quick interview will take place.


If you wish to be a test participant, in order to guarantee your anonymity and privacy rights and to assure the adequate use of the collected data, you will need to sign an informed consent form.

IMPORTANT: What is being evaluated are the tools, NOT you. There is no "right way" to perform each task. The difficulties and facilities that each tool provides in each task are the main focus of the test.

12.2 Questionnaire

1) How often do you use computers?

4 - Many times a day

3 - At least once a day

2 - At least once a week

1 - Once in a while

0 - Never

2) Do you know any programming language?

0 - No (go to question 3)

1 - Yes. Which ones? ___________________________________________

a) Which one are you most familiar with? ________________________

b) How often do you use this programming language?

4 - Many times a day

3 - At least once a day

2 - At least once a week

1 - Once in a while

c) How would you classify your knowledge of this programming language?

4 - I know the language deeply and I can do everything I want with it

3 - I have a fair knowledge of the language, but sometimes I need to learn a little more

2 - I know little about the language, knowing only its basic usage

1 - Still learning the basics


3) Have you ever had to design a graphical user interface (an application window or a web page, for example), i.e., to plan how the interface elements are arranged and how the interface "works"?

0 - No (end of questionnaire)

1 - Yes

a) How often have you had to create an interface design?

3 - Practically every week

2 - A few times a month

1 - A few times a year

b) How do you usually elaborate the designs? (you may mark more than one option)

- Using paper drawings

- Using a generic tool:

  o Image editing software (Paint, Photoshop, GIMP, etc.)

  o PowerPoint

  o HTML

  o Others? _____________________________________________

- Using a specific prototyping tool:

  o Visio

  o Balsamiq

  o Dreamweaver

  o Specific language designer (Qt Designer, Expression Blend, etc.)

  o Others? _____________________________________________

- Implementing in code

4) Have you ever used any of the prototyping tools to be evaluated in this test?

0 - No (end of questionnaire)

1 - Yes, please fill in the following table:


For each tool (Paper, Balsamiq, UISKEI), please answer:

How would you classify your knowledge of this tool?

4 - I know the tool deeply and I can do everything I want with it

3 - I have a fair knowledge of the tool, but sometimes I need to learn a little more

2 - I know little about the tool, knowing only its basic usage

1 - Still learning the basics

0 - I don't know / Never used

How often do you use this tool?

4 - Many times a day

3 - At least once a day

2 - At least once a week

1 - Once in a while

0 - Never

12.3 First Cycle

12.3.1. Scenario 01

You are developing an e-commerce application and wish to plan how the checkout process will occur after the products are in the cart. You imagine a system where the user must identify him/herself to the system (entering a username and a password, for example), informing whether he/she is already a registered user. If he/she is already a registered user, he/she will see the "Checkout of registered user" screen, where he/she can choose previously used payment information (such as address and credit card). If he/she is not a registered user, he/she will see the "Register user" screen, in which he/she must enter other information, such as e-mail, password confirmation, etc.

12.3.2. Task 01

Sketch the screen below, related to the presented scenario, in the available

three tools and in the established order, as you would present the planned

prototype to a client.


After drawing the screens, simulate the prototype, exploring all the alternative interaction paths.

P.S.: The resulting screens ("Checkout of registered user" and "Register user") don't need to be sketched. They are only needed to distinguish which screen would result from pressing the button.

12.4 Second Cycle

12.4.1. Scenario 02

Talking to a friend of yours, he tells you he is outraged at being obliged to register with a web store where he never plans to buy again. To avoid this kind of dissatisfaction among your future clients, you plan to add to that initial screen a "Buy without registration" option. If the client activates this option, the client identification fields (login and password) will be disabled. Depending on the chosen options, the client can go to different screens, described below:

"Buy without registration"   "Already a registered user"   Result
no                           no                            "Register user"
no                           yes                           "Checkout of registered user"
yes                          no                            "Checkout without register"
yes                          yes                           "Error 1" (the client informed that he/she doesn't want to buy with a registration and that he/she is a registered user)

12.4.2. Task 02

Sketch the screen below, related to the presented scenario, in the three available tools and in the established order. Then, demonstrate how you would test the prototype with a user, showing how the checkbox "Buy without registration" determines whether the fields "Login" and "Password" are enabled or disabled, and how the combination of this checkbox with the other one can lead to 4 different results.

12.5 Third Cycle

12.5.1. Scenario 03

Browsing competitors' websites, you discover a "Facebook connect" feature, which allows a site to use the user's Facebook registration to identify him/her on other sites. To integrate this possibility into your website, you change the "Login" label to "E-Mail" and add a new option to indicate whether the information provided is from the site or from Facebook. With this new option, the possible paths are:

"Facebook Connect"   "Buy without registration"   "Already a registered user"   Result
no                   no                           no                            "Register user"
no                   no                           yes                           "Checkout of registered user"
no                   yes                          no                            "Checkout without register"
no                   yes                          yes                           "Error 1" (Without registration X Registered user)
yes                  no                           no                            "Register user using Facebook"
yes                  no                           yes                           "Checkout of Facebook registered user"
yes                  yes                          no                            "Error 2" (Without registration X Facebook)
yes                  yes                          yes                           "Error 3" (Without registration X Registered with Facebook)

Page 90: Vinícius Costa Villas Bôas Segura UISKEI: Sketching the ...simone/files/VSegura2011.pdfand widgets (enabling/disabling widgets, for example). This dissertation presents the main

90

12.5.2. Task 03

Talk about what modification you would make in the prototypes and how

you would test them, using the available three tools and in the established order.

Try talking about how each tool makes the process of building the prototype and

its behavior easier or harder.


13 Appendix C: Complete test results

The tables in this appendix show the data collected from the evaluation study for every participant, represented by p<n> (the fifth participant is named p5, for example). The following table shows the tool presentation order, where P stands for Paper, B for Balsamiq and U for UISKEI. It can be seen that every combination (6 in total) was performed by two different participants.

Table 9: Tools presentation order.

Order   p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12
P        1   1   2   3   2   3   1   3   1    2    3    2
B        2   3   1   2   3   1   3   1   2    1    2    3
U        3   2   3   1   1   2   2   2   3    3    1    1

The next table shows the answers to the initial questionnaire presented to the participants. The complete questionnaire script and the answer codes can be seen in Section 12.2.

Table 10: Questionnaire answers (x = not applicable).

Question   p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12
1           4   4   4   4   4   4   4   4   4    4    4    4
2           1   1   1   1   1   1   1   1   1    1    1    1
2b          2   4   1   1   4   3   2   4   3    2    4    4
2c          2   3   2   2   3   3   3   4   4    2    3    3
3           0   1   1   1   1   1   1   1   1    1    1    1
3a          x   1   2   1   1   3   1   1   2    2    1    3
4           1   0   1   0   0   1   1   1   1    0    0    1
4p1         2   x   2   x   x   3   0   2   4    x    x    4
4p2         0   x   1   x   x   1   0   1   1    x    x    1
4b1         0   x   0   x   x   0   0   0   0    x    x    0
4b2         0   x   0   x   x   0   0   0   0    x    x    0
4u1         1   x   0   x   x   2   2   0   0    x    x    0
4u2         0   x   0   x   x   1   1   0   0    x    x    0


The following tables show the scores given by the participants in the questionnaire presented after each cycle (the questions can be found in Section 8.1). They also present the average score and standard deviation for each question and for each group of questions (grouped into user interface questions and interaction questions, as discussed in Section 8.2).

Table 11: First cycle questionnaire answers and statistics.

Paper
Q    p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12   Ave   Std Dev
1     5   5   5   5   5   5   5   3   5    5    5    5   4.83   0.58
2     5   4   4   5   3   5   4   5   4    4    4    5   4.33   0.65
3     5   3   5   5   4   5   3   5   5    2    4    5   4.25   1.06
4     5   5   5   4   2   4   4   4   3    4    3    4   3.92   0.90
5     3   2   5   5   4   5   4   5   3    4    5    5   4.17   1.03
6     2   4   5   5   2   5   4   2   4    3    3    1   3.33   1.37
7     5   5   5   5   4   5   4   2   4    3    3    4   4.08   1.00
8     5   5   5   5   3   5   4   3   5    5    4    4   4.42   0.79
9     3   3   5   4   2   3   4   2   3    4    3    1   3.08   1.08
10    5   3   5   3   2   5   3   1   3    4    5    1   3.33   1.50
Questions 1-4 (user interface): Ave 4.33, Std Dev 0.86
Questions 5-10 (interaction):   Ave 3.74, Std Dev 1.22

Balsamiq
Q    p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12   Ave   Std Dev
1     4   3   4   5   5   4   5   5   5    4    5    4   4.42   0.67
2     3   4   5   5   3   5   5   5   5    2    5    4   4.25   1.06
3     3   5   4   5   2   5   5   5   4    4    5    5   4.33   0.98
4     3   4   4   5   2   4   5   4   4    4    5    4   4.00   0.85
5     2   2   5   3   3   4   2   2   2    3    4    3   2.92   1.00
6     1   3   2   4   2   2   3   3   3    1    2    2   2.33   0.89
7     2   4   5   3   2   3   4   5   4    2    3    4   3.42   1.08
8     2   4   4   4   4   4   4   4   4    3    3    3   3.58   0.67
9     2   3   4   3   2   2   4   2   2    1    1    3   2.42   1.00
10    4   3   5   5   2   4   4   4   3    1    2    1   3.17   1.40
Questions 1-4 (user interface): Ave 4.25, Std Dev 0.89
Questions 5-10 (interaction):   Ave 2.97, Std Dev 1.10

UISKEI
Q    p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12   Ave   Std Dev
1     5   3   3   5   2   4   4   4   5    3    5    4   3.92   1.00
2     5   3   2   5   3   3   3   4   5    2    5    3   3.58   1.16
3     5   4   4   5   5   5   4   5   3    4    4    5   4.42   0.67
4     5   5   2   5   3   3   4   5   5    4    5    4   4.17   1.03
5     2   3   3   5   4   5   2   5   4    4    5    5   3.92   1.16
6     4   4   3   5   5   5   3   5   5    3    5    4   4.25   0.87
7     4   3   3   4   3   5   2   4   4    3    5    4   3.67   0.89
8     4   3   2   3   4   5   2   5   5    2    5    5   3.75   1.29
9     5   4   2   5   3   4   3   5   5    3    5    4   4.00   1.04
10    5   5   1   5   4   5   4   3   4    4    5    4   4.08   1.16
Questions 1-4 (user interface): Ave 4.02, Std Dev 1.00
Questions 5-10 (interaction):   Ave 3.94, Std Dev 1.06


Table 12: Second cycle questionnaire answers and statistics.

Paper
Q    p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12   Ave   Std Dev
1     5   5   5   5   5   4   5   3   4    5    5    5   4.67   0.65
2     3   5   3   5   3   5   3   2   5    4    3    5   3.83   1.11
3     3   4   2   5   2   5   3   5   4    1    4    4   3.50   1.31
4     3   3   5   4   3   4   4   3   3    3    2    3   3.33   0.78
5     5   2   5   5   3   5   4   4   2    4    5    5   4.08   1.16
6     2   2   4   5   2   5   3   2   4    3    2    1   2.92   1.31
7     4   4   4   5   4   5   4   1   4    5    3    3   3.83   1.11
8     4   4   3   5   3   5   3   2   3    5    3    1   3.42   1.24
9     3   2   5   4   2   4   4   1   3    5    2    1   3.00   1.41
10    5   2   4   4   2   4   3   1   2    1    5    1   2.83   1.53
Questions 1-4 (user interface): Ave 3.83, Std Dev 1.10
Questions 5-10 (interaction):   Ave 3.35, Std Dev 1.34

Balsamiq
Q    p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12   Ave   Std Dev
1     5   5   5   5   4   5   5   5   5    4    5    4   4.75   0.45
2     4   5   5   5   3   5   3   5   4    4    5    3   4.25   0.87
3     5   5   5   5   2   5   5   4   4    4    5    2   4.25   1.14
4     2   4   5   5   4   4   5   4   3    4    5    3   4.00   0.95
5     5   2   5   5   4   4   5   3   4    4    4    5   4.17   0.94
6     1   3   5   5   2   4   4   2   1    2    2    1   2.67   1.50
7     4   5   5   5   4   5   4   4   3    5    2    4   4.17   0.94
8     4   4   5   5   2   5   3   5   2    4    2    1   3.50   1.45
9     2   2   5   4   2   3   5   1   2    1    1    1   2.42   1.51
10    2   3   5   5   3   4   4   4   2    3    2    1   3.17   1.27
Questions 1-4 (user interface): Ave 4.31, Std Dev 0.90
Questions 5-10 (interaction):   Ave 3.35, Std Dev 1.42

UISKEI
Q    p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12   Ave   Std Dev
1     5   5   5   5   5   5   4   3   5    5    5    5   4.75   0.62
2     5   4   4   5   4   4   5   5   5    3    5    4   4.42   0.67
3     5   5   5   5   5   5   4   5   4    4    5    5   4.75   0.45
4     5   5   5   5   4   3   4   5   4    3    5    5   4.42   0.79
5     2   3   4   4   4   5   3   4   3    3    5    5   3.75   0.97
6     5   4   5   4   4   5   3   4   4    3    5    5   4.25   0.75
7     4   3   5   3   4   5   4   3   4    3    5    4   3.92   0.79
8     4   4   5   4   3   4   4   3   5    3    5    4   4.00   0.74
9     5   4   5   5   4   4   3   5   4    3    5    5   4.33   0.78
10    5   5   5   5   5   5   4   4   3    4    5    5   4.58   0.67
Questions 1-4 (user interface): Ave 4.58, Std Dev 0.65
Questions 5-10 (interaction):   Ave 4.14, Std Dev 0.81


Table 13: Third cycle questionnaire answers and statistics.

Paper
Q    p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12   Ave   Std Dev
1     5   5   5   5   5   5   5   3   5    5    5    5   4.83   0.58
2     1   3   3   5   3   5   2   2   4    5    3    5   3.42   1.38
3     5   3   3   5   2   5   3   5   2    1    4    5   3.58   1.44
4     1   2   4   4   2   4   4   3   2    2    2    4   2.83   1.11
5     5   2   5   5   4   5   4   2   4    5    5    5   4.25   1.14
6     1   2   3   5   2   5   2   1   5    5    2    1   2.83   1.70
7     4   3   5   5   3   5   4   2   5    5    1    4   3.83   1.34
8     4   3   4   5   3   5   2   2   5    5    1    1   3.33   1.56
9     2   1   4   3   1   4   3   1   3    2    1    1   2.17   1.19
10    5   1   4   3   2   5   3   1   1    1    5    1   2.67   1.72
Questions 1-4 (user interface): Ave 3.67, Std Dev 1.36
Questions 5-10 (interaction):   Ave 3.18, Std Dev 1.58

Balsamiq
Q    p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12   Ave   Std Dev
1     5   5   4   5   5   3   5   5   5    4    5    5   4.67   0.65
2     4   5   2   5   3   4   1   3   3    4    5    2   3.42   1.31
3     5   5   5   5   1   5   5   3   2    4    5    2   3.92   1.51
4     2   4   3   5   1   4   5   3   3    4    5    3   3.50   1.24
5     5   4   5   5   4   4   4   2   3    5    4    5   4.17   0.94
6     1   4   1   5   2   3   1   2   1    2    2    1   2.08   1.31
7     4   5   2   5   3   5   4   3   3    4    1    4   3.58   1.24
8     4   4   2   4   3   5   1   3   2    3    1    1   2.75   1.36
9     1   2   1   4   2   3   2   1   1    1    1    1   1.67   0.98
10    1   4   1   5   1   3   3   3   1    5    1    1   2.42   1.62
Questions 1-4 (user interface): Ave 3.88, Std Dev 1.28
Questions 5-10 (interaction):   Ave 2.78, Std Dev 1.49

UISKEI
Q    p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12   Ave   Std Dev
1     5   5   5   5   5   5   5   3   5    5    5    5   4.83   0.58
2     5   4   4   5   4   5   4   5   5    5    5    5   4.67   0.49
3     5   5   5   5   5   5   4   5   3    5    5    5   4.75   0.62
4     5   5   5   5   5   4   4   5   5    4    5    4   4.67   0.49
5     4   4   5   4   5   5   4   4   5    4    5    5   4.50   0.52
6     5   4   4   4   5   5   5   4   4    3    5    5   4.42   0.67
7     5   4   4   3   5   5   4   4   4    3    5    4   4.17   0.72
8     5   4   5   3   5   5   4   3   4    3    5    4   4.17   0.83
9     5   4   4   5   5   4   3   5   4    4    5    5   4.42   0.67
10    5   5   5   5   5   5   4   4   3    4    5    5   4.58   0.67
Questions 1-4 (user interface): Ave 4.73, Std Dev 0.54
Questions 5-10 (interaction):   Ave 4.38, Std Dev 0.68


Lastly, the following table shows the answers to the interview questions (presented in Section 8.1), with the same letter code used before (P = Paper, B = Balsamiq, U = UISKEI). The last columns show the number of participants who chose each tool in each question.

Table 14: Interview answers.

Question   p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11  p12     P   B   U
1           B   U   P   B   U   P   B   P   U    U    B    B     3   5   4
2           B   B   B   U   U   B   B   B   B    B    B    B     0  10   2
3           U   U   U   U   U   U   P   U   U    U    U    U     1   0  11