


BUILDING A PRIVACY GUARDIAN FOR THE ELECTRONIC AGE

Project number: IST-2000-26038
Project title: PISA
Deliverable type: Public Usage
Deliverable number: D23
Release number: 01 (Alpha) -- 2003-02-06
Contractual date of delivery: January 31, 2003
Actual date of delivery: January 31, 2003
Title of deliverable: D23: Agent User Interfaces and Documentation
Work package: 5.3
Nature of the deliverable: Report
Author(s): Andrew S. Patrick (NRC); Stephen Kenny (formerly Netherlands Data Protection Authority)

Abstract
This is the first report of the Human-Computer Interaction (HCI) research and development activities being conducted in the PISA project. The goal was to build an agent-based service that people will trust with sensitive, personal information and one that will operate according to privacy-protection requirements coming from legislation and best practices. Two different research activities were conducted. The first was to carefully examine the concept of "trust" and develop requirements and guidelines for building trustworthy agent systems. The second was to examine the privacy legislation and principles to determine the human-factors implications and consequences. The result was a process that begins with privacy legislation, works through derived privacy principles, examines the HCI requirements, and ends with specific interface design solutions. Following this general research, specific system design requirements for the PISA Demonstrator were developed. Further, in order to demonstrate the HCI design concepts and kick-start the interface design portion of the project, a stand-alone interface demonstration was developed and demonstrated. Finally, planning has begun for usability evaluations to be conducted in 2003 that will test the PISA interface and the completed system.

Keyword list: user interface, human factors, human-computer interaction, trust


Executive Summary

This is the first report of the Human-Computer Interaction (HCI) research and development activities being conducted in the PISA project. The goal of the HCI activities in the PISA project is "to understand and develop technologies that will improve human interaction with the distributed virtual marketplace that is emerging in the information and communications industries" (from the original PISA Description of Work). More specifically, the task is to build an agent-based service that people will trust with sensitive, personal information and one that will operate according to privacy-protection requirements coming from legislation and best practices.

To meet these goals, two different research activities have been conducted. The first was to carefully examine the concept of "trust" and review what is known about building trustworthy systems. It was found that intelligent, autonomous agents have the potential to facilitate complex, distributed tasks and protect users' privacy. However, building agents users will trust with personal and sensitive information is a difficult design challenge. Agent designers must pay attention to human factors issues that are known to facilitate feelings of trust. These include providing transparency of function, details of operation, feedback, and predictability. They must also consider factors that lead to feelings of risk taking. This means reducing uncertainty, collecting the minimal amount of information, and carefully considering the amount of autonomy an agent will have. These guidelines are being used in the development of the PISA Demonstrator, and they are also of interest to other system designers.

The second research activity was to examine the privacy legislation and principles to determine the human-factors implications and consequences. The goal of this work was to document a process that begins with privacy legislation, works through derived privacy principles, examines the HCI requirements, and ends with specific interface design solutions. This research involved a phrase-by-phrase analysis of the European Privacy Directive (95/46/EC) to determine the human behavior requirements that were implied by the legal constructs. Interface design techniques were then outlined for each of the requirements, and specific design solutions were developed for the PISA Demonstrator. The result is a set of recommendations for implementing "usable compliance" with privacy legislation and principles. For the first time, this work specified what must be included in human-computer interfaces to satisfy the spirit of European privacy legislation and principles, and satisfy the privacy needs of the users ("usable compliance").

These research activities have led to the publication of a journal article on building trustworthy agents and a workshop submission on deriving HCI requirements and usable compliance from privacy principles and best practices.

Following this general research, specific system design requirements for the PISA Demonstrator were developed. Some of these requirements are related to building trustworthy interfaces, and they include features that should be present in the interface, supporting information that is required (i.e., help, documentation), overall system characteristics and capabilities, and system performance issues. Some of the requirements come from the analysis of privacy legislation and principles. These requirements are organized around user tasks, and include registering and agreeing to service terms, creating a task, tracking a task, modifying a task, and dealing with system messages.

The next step was to determine how these requirements could be met in the PISA Demonstrator. A new technique called "Privacy Interface Analysis" was developed to describe how UML models can be created from an HCI point of view, and the role they play in developing trustworthy, privacy-protecting interfaces. This work involved creating and annotating a UML sequence diagram for each use case that was expected in the application. For the PISA project, this technique was quite successful in making assumptions concrete and stimulating discussions within the development team.

In order to demonstrate the HCI design concepts developed for PISA, and kick-start the interface design portion of the project, a stand-alone interface demonstration was developed at NRC. The goal was to develop a set of WWW pages and back-end applications that would demonstrate a "look and feel" for the PISA demonstrator before the actual agent platform was available. This prototype was designed to be both stand-alone, so it could be demonstrated and tested, and also modular and well structured, so it could be integrated into the final PISA demonstrator. These design goals have been met. A stand-alone interface demonstration is available and the interface concepts and page designs have been integrated into the main PISA Demonstrator.

Planning has begun for usability evaluations of the PISA interface. Following the original project plan, there will be two rounds of usability evaluations. The first round, to take place early in 2003, will test the interface concepts and implementation in the interface prototype, and the second, to take place in the fall of 2003, will be an evaluation of the complete PISA Demonstrator.


PISA Project Information

Contribution
PISA contributes to key action lines of the IST programme of the EC:

• II4.1: "To develop and validate novel, scalable and interoperable technologies, mechanisms and architectures for trust and security in distributed organisations, services and underlying infrastructures."

• II4.2: "To scale-up, integrate, validate and demonstrate trust and confidence technologies and architectures in the context of advanced large-scale scenarios for business and everyday life. This work will largely be carried out through trials, integrated test-beds and combined RTD and demonstrations."

Goal
The objectives of the PISA project are:

• Demonstration of PET as a secure technical solution to protect the privacy of the citizen when he/she is using Intelligent Agents (called shopbots, buybots, pricebots or just "bots", short for robot¹) in E-commerce or M-commerce applications, according to EC Directives on Privacy.

• Interaction with industry and government to launch new privacy-protected services. The PISA project will produce a handbook on Privacy and PET for ISAT and a PISA agent as shareware. Also a plan for the dissemination of the results of PISA will be produced.

• Propose a standard for Privacy-Protected Agent Transactions to Standardisation Bodies.

Results
PISA contributes to building a model of a software agent within a network environment, to demonstrate that it is possible to perform complicated actions on behalf of a person without the personal data of that person being compromised. In the design of the agent an effective selection of the presented Privacy-Enhancing Technologies (PET) will be implemented. We label this product a Privacy Incorporated Software Agent (PISA).

The PISA demonstration model is planned to be a novel piece of software that incorporates several advanced technologies in one product:
• Agent technology, for intelligent search and matching;
• Data mining or comparable techniques to construct profiles and make predictions;
• Cryptography for the protection of personal data, as well as the confidentiality of transactions.

Additionally the project involves:
• Legal expertise to implement the European privacy legislation and the needed development of new rules and norms;
• System design knowledge to turn legal boundary conditions into technical specifications;
• Advanced software-programming skills to implement the privacy boundary conditions.

In order to prove the capability of the PISA model, we propose to test it in a model environment in two e-commerce cases that closely resemble a real-life situation.

¹ In E-commerce, "Bots" Will Slug It Out for Us; International Herald Tribune, 21 August 2000


PISA Project Consortium

• TNO-FEL Physics and Electronics Laboratory
  Oude Waalsdorperweg 63, P.O. Box 96864, 2509 JG The Hague, The Netherlands
  Project co-ordination, Privacy Enhanced Technologies

• TNO-TPD Institute of Applied Physics
  Stieltjesweg 1, P.O. Box 155, 2600 AD Delft, The Netherlands
  Intelligent Software Agents Platform and PISA demonstrator

• Netherlands Data Protection Authority
  Prins Clauslaan 20, Postbus 93374, 2509 AJ The Hague, The Netherlands
  Privacy Protection and Legal Issues

• Delft University of Technology, Faculty of Information Technology and Systems, Information Theory Group
  Mekelweg 4, P.O. Box 5031, 2600 GA Delft, The Netherlands
  Cryptography

• Sentient Machine Research
  Baarsjesweg 224, 1058 AA Amsterdam, The Netherlands
  Data Mining, Data Matching and Cases

• FINSA Consulting, Italsoft
  52, Corso del Rinascimento, 00186 Rome, Italy
  Intelligent Software Agents and Multimedia Development

• National Research Council Canada, Institute for Information Technology
  Montreal Road, Building M-50, Ottawa, Ontario, Canada K1A 0R6
  Network, Scalability and User Interfaces

• GlobalSign
  Haachsesteenweg 1426 Chaussee de Haecht, 1130 Brussels, Belgium
  Public Key Infrastructure


Acknowledgements
The authors would like to thank the entire PISA team for their helpful contributions, comments, and ideas throughout this research program. In particular, we would like to thank Kathy Cartrysse, Alfredo Ricchi, and Martijn van Breukelen for their detailed comments on the prototype interface.

Andrew Patrick would also like to give special acknowledgement to Richard Gerrard. Richard was a co-operative education student from Carleton University who spent eight months working on this project. Richard was instrumental in executing and implementing the ideas that developed during the project, most notably the HCI UML diagrams and the coding of the prototype interface.


Table of Contents

1. Introduction ... 11
2. Building Trustworthy Software Agents ... 12
   2.1 Trust and Agents ... 12
   2.2 Concerns About Agents ... 12
   2.3 The PISA Reference Case: The Job-Searching Agent ... 13
   2.4 Agents and Trust ... 14
      2.4.1 What is Trust? ... 14
      2.4.2 The Problem of Trusting Agents: Interactions Twice-Removed ... 15
   2.5 Building Successful Agents: A Summary Model ... 16
      2.5.1 Factors Contributing to Trust ... 17
      2.5.2 Factors Contributing to Perceived Risk ... 23
   2.6 Checking Your Work: Human-Factors Evaluation Techniques ... 24
   2.7 Conclusions ... 25
3. From Privacy Legislation to Interface Design ... 26
   3.1 Privacy Guidelines and Legislation ... 27
   3.2 Methods for Privacy Protection ... 27
   3.3 Related Work ... 27
   3.4 Privacy Principles ... 28
      3.4.1 EU Legislation ... 28
      3.4.2 Overview of the Resulting Principles ... 29
   3.5 HCI Requirements ... 29
   3.6 Interface Methods to Meet Requirements ... 33
      3.6.1 Comprehension ... 33
      3.6.2 Consciousness ... 35
      3.6.3 Control ... 35
      3.6.4 Consent ... 36
   3.7 Summary and Conclusions ... 38
4. PISA System Design Requirements ... 40
   4.1 Trustworthy Interface Design Requirements ... 40
   4.2 Usable Compliance Design Requirements ... 42
5. UML Modelling & Privacy Interface Analysis ... 44
   5.1 The Privacy Interface Analysis Methodology ... 44
      5.1.1 Develop a Service/Application Description ... 45
      5.1.2 Explore and Resolve the HCI Requirements ... 46
   5.2 The PISA Privacy Interface Analysis ... 48
6. The PISA Interface Prototype ... 49
   6.1 Satisfying the HCI Requirements in the Interface Prototype ... 50
   6.2 Integration with the Main PISA Demonstrator ... 50
7. Usability Evaluation Plan ... 52
8. References ... 53
9. Appendices ... 56
   9.1 Detailed PISA Interface Requirements Analysis ... 56
   9.2 HCI UML Diagrams for the PISA Demonstrator ... 66
   9.3 Prototype Interface Code Documentation ... 82


List of Figures
Figure 2.1: Explanation of twice-removed transactions ... 15
Figure 2.2: Lee, Kim, & Moon's model of e-commerce loyalty ... 17
Figure 2.3: A model of agent success ... 18
Figure 3.1: Schematic representation of our approach ... 26
Figure 3.2: A door with poor affordances ... 36
Figure 3.3: An example of a Just-In-Time Click-Through Agreement (JITCTA) ... 39
Figure 5.1: Illustration of the major PISA tasks or modules ... 44
Figure 5.2: Use Case Diagram for the PISA Demonstrator ... 45
Figure 5.3: Object sequence diagram for the Register use case ... 46
Figure 5.4: A possible track agent interface screen illustrating HCI solutions ... 48
Figure 6.1: Screen capture of the main navigation screen in the interface prototype ... 50
Figure 7.1: Schematic representation of remote usability testing setup ... 52


List of Tables
Table 2.1: Interface Characteristics That Build Trust ... 21
Table 2.2: Visual Interface Components That Build Trust ... 22
Table 2.3: System Behaviors That Build Trust ... 22
Table 2.4: Approaches to Personal Risk Analysis ... 23
Table 3.1: Privacy Roles Defined by The Directive ... 29
Table 3.2: High-Level Summary of Privacy Principles ... 30
Table 3.3: Privacy Principles, HCI Requirements, and Design Solutions ... 31
Table 3.4: Guidelines for Creating Click-Through Agreements ... 37
Table 4.1: PISA Interface Feature Requirements ... 40
Table 4.2: PISA Supporting Information Requirements ... 41
Table 4.3: PISA Overall System Characteristic Requirements ... 41
Table 4.4: Requirements for Use Case: Registering & Agreeing to Service Terms ... 42
Table 4.5: Requirements for Use Case: Creating a Job Search Task ... 42
Table 4.6: Requirements for Use Case: Tracking a Task ... 43
Table 4.7: Requirements for Use Case: Modify a Task ... 43
Table 4.8: Requirements for Use Case: Handling System Messages ... 43
Table 6.1: Notable HCI Design Features in the Interface Prototype ... 51


1. Introduction

This is the first report of the Human-Computer Interaction (HCI) research and development activities being conducted in the PISA project. HCI is the study of mental processes and behavior as they pertain to users interacting with computers (and other technical devices). HCI is often thought of as a blend of cognitive psychology and computer science, and the discipline is usually taught in one or both of these university departments. An important sub-topic in HCI is interface design, the discipline of creating easy-to-use and effective control systems. Interface design is important for privacy because the users interact with the software or service through the interface, so all the privacy features must be represented in the interface design.

The goal of the HCI activities in the PISA project is "to understand and develop technologies that will improve human interaction with the distributed virtual marketplace that is emerging in the information and communications industries" (from the original PISA Description of Work). More specifically, the task is to build an agent-based service that people will trust with sensitive, personal information and one that will operate according to privacy-protection requirements coming from legislation and best practices.

To meet these goals, two different research activities have been conducted. The first was to carefully examine the concept of "trust" and review what is known about building trustworthy systems. The result of this work is a set of system design guidelines for building trustworthy agent systems. These guidelines are being used in the development of the PISA Demonstrator, and they are also of interest to other system designers. This research activity is described in Chapter 2, and a journal article describing this research has been published in a leading computer science journal (Patrick, 2002).

The second research activity was to examine the privacy legislation and principles to determine the human-factors implications and consequences. This research involved a phrase-by-phrase analysis of the European Privacy Directive (European Commission, 1995) to determine the human behavior requirements that were implied by the legal constructs. Interface design techniques were then outlined for each of the requirements, and specific design solutions were developed for the PISA Demonstrator. The result is a set of recommendations for implementing "usable compliance" with privacy legislation and principles. This research activity is described in Chapter 3, and a paper describing this research has been submitted to an upcoming PET workshop (Patrick & Kenny, 2002).
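The legislation-to-principles-to-requirements-to-design pipeline can be pictured as a simple data model. The sketch below is purely illustrative: the class, its field names, and the sample entry are ours, loosely patterned on the structure of the analysis (one principle mapped to an HCI requirement and a design solution), not an actual PISA artifact.

```python
from dataclasses import dataclass


@dataclass
class PrivacyMapping:
    """One row of a legislation-to-design analysis (hypothetical schema)."""
    principle: str        # privacy principle derived from the Directive's text
    hci_requirement: str  # the human behavior the principle implies
    design_solution: str  # interface technique intended to satisfy it


# Illustrative entry: informed consent addressed with a
# Just-In-Time Click-Through Agreement (JITCTA).
consent = PrivacyMapping(
    principle="Processing requires the data subject's unambiguous consent",
    hci_requirement="Users must understand and explicitly agree to each "
                    "use of their personal data",
    design_solution="Just-In-Time Click-Through Agreement shown at the "
                    "moment the data would be disclosed",
)

print(consent.design_solution)
```

A table of such entries is essentially what the report's Table 3.3 (Privacy Principles, HCI Requirements, and Design Solutions) captures in document form.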

The other chapters of this report describe how these research results have been put into practice in building the PISA Demonstrator. Chapter 4 summarizes the system design requirements for the PISA project from a human-factors point of view. Chapter 5 describes how UML modelling techniques were used to determine how the requirements would be satisfied in the PISA system. Chapter 6 describes a PISA Interface Prototype that was developed to illustrate and test the system requirements and interface design solutions. This chapter also reviews how the system design requirements have been satisfied in the interface prototype, and how the HCI knowledge and techniques are being integrated with the agent-platform software and security mechanisms in the PISA Demonstrator. Chapter 7 introduces the usability testing that will be conducted as the next HCI activity of the PISA project. Finally, the Appendices in Chapter 9 provide detailed listings and diagrams on the PISA interface requirements, the HCI UML diagrams, and the PISA Interface Prototype.


2. Building Trustworthy Software Agents

2.1 Trust and Agents

There is increasing interest within the software community in developing intelligent software agents. This interest is a result of a growing frustration with using direct manipulation interfaces (i.e., mice, GUIs) for increasingly complex tasks, information overload, and a need to exploit the rapidly expanding network of distributed information and services. These trends are leading to a desire for software that explores, anticipates, adapts, and actively assists its users in ways not possible today (Bradshaw, 1997). In addition, software that acts on behalf of a user may be useful for protecting the identity and privacy of the user. By including privacy protection measures and having an agent perform tasks on behalf of a user, anonymity can be maintained and the agent can share only the personal information that the user desires.

An agent can be defined as an entity that operates autonomously without direct user control, but under commands previously issued by the user. A classic example of an agent is a butler or secretary who makes decisions and commitments on behalf of their bosses. There is often a close relationship between the agent and their "user" so that, for example, the butler learns his boss' likes and habits and is able to anticipate and respond effectively, even if the boss is not present. The idea behind software agents is to capture the power and effectiveness of human-human, boss-butler relationships in human-computer, user-agent software systems. The goal is to develop software that acts like a butler or secretary, taking actions, anticipating problems, making decisions, and improving the life of the user (Negroponte, 1997).

The level of autonomy and independence of an agent can be described in terms of "active" agents that independently perform actions on behalf of the users (e.g., make purchases or business commitments), and "advice" agents that merely provide advice or suggestions for the users to consider. In addition, the sophistication of agents can range from simple scripts that run periodically on a user's machine, to complex programs that travel autonomously across a network while performing remote tasks on behalf of the user (mobile agents).

Most of the software agents in use today (see www.agentland.com for examples) are relatively simple advice systems. For example, Lieberman et al.'s (2001) Letizia system is an agent that autonomously searches for WWW pages based on what users are currently viewing in their browser. The agent simply presents pages that may be of interest and the user can choose to attend to or ignore these suggestions. One example of an active agent in use today is the proxy bidding service found on eBay.com (http://pages.ebay.com/help/buyerguide/bidding-prxy.html). This agent autonomously submits bids on behalf of the user according to a maximum price specified when the agent is launched. This agent truly acts on behalf of the user because it makes financial commitments without direct user control.
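The core behaviour of such a proxy-bidding agent can be sketched in a few lines. This is a simplified illustration of the general technique, not eBay's actual rules; in particular, the fixed bid increment and the function name are our assumptions.

```python
from typing import Optional


def proxy_bid(current_bid: float, max_price: float,
              increment: float = 1.0) -> Optional[float]:
    """Return the agent's next bid, or None if its mandate is exhausted.

    The agent bids just enough to lead (current bid plus one increment),
    capped at the user's stated maximum. It commits the user's money
    without asking again -- this is what makes it an "active" agent.
    """
    next_bid = min(current_bid + increment, max_price)
    if next_bid <= current_bid:
        return None  # the agent gives up rather than exceed its mandate
    return next_bid


# The user authorises up to 50; a rival currently leads at 48.
print(proxy_bid(48.0, 50.0))  # 49.0
print(proxy_bid(50.0, 50.0))  # None -- ceiling reached, no further bids
```

The `max_price` parameter is exactly the "maximum price specified when the agent is launched" described above: the one piece of user input that bounds all of the agent's autonomous financial commitments.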

2.2 Concerns About Agents

As agents become more active and more sophisticated, the implications of their actions and any errors they make will become more serious. With today's GUI interfaces, errors made by the user or software can often be easily fixed or "undone". An agent performing actions on behalf of a user could make errors that are very difficult to "undo" (e.g., making a purchase commitment) and, depending on the complexity of the agent, it may not be clear what went wrong. For example, the agent may have failed to "understand" the instructions, or made an error during execution (Erickson, 1997).


Moreover, in order for agents to operate effectively and truly act on behalf of their users, they may be given information that is confidential or sensitive. This includes financial details (e.g., credit card numbers) and personal contact information (e.g., telephone numbers) that should not be shared indiscriminately on public networks. Thus, along with the excitement about agents and what they can do, there is concern about the security and privacy issues that will result. Negroponte (1997) describes the ideal agent as the equivalent of "a well-trained English butler" who knows your needs, likes, and habits. Negroponte goes on to describe the privacy issues:

All of us are quite comfortable with the idea that an all-knowing agent might live in our television set, pocket, or automobile. We are rightly less sanguine about the possibility of such agents living in the greater network. All we need is a bunch of tattletale or culpable agents. Enough butlers and maids have testified against former employers for us to realize that our most trusted agents, by definition, know the most about us. (p. 62)

In order for agents to be accepted, users will have to trust them with private information, and the agents will have to handle that information in a secure fashion. This trust becomes very important where users may suffer physical, financial, or psychological harm because of the actions of an agent (Bickmore & Cassell, 2001). It is not enough to assume that well-designed software agents will provide the security and privacy users need. Assurances and assumptions about security and privacy need to be made explicit to the user. Without this information the users may assume that systems are not secure and private when they are, or that their privacy is being protected when it is not. For example, users of corporate e-mail systems often assume a high degree of privacy, when in fact there can be very little. Courts have repeatedly ruled that employers can use private e-mail messages, and such messages have been used in court cases (Weisband & Reinig, 1995). Developing and maintaining the appropriate levels of trust will be very difficult. The focus of the current paper is to review the human-factors issues relevant to developing trusted agents. The interface and system design issues that can lead to trust will be reviewed, along with the factors that increase perceived risk. The combination of trust and risk will determine the willingness of users to accept and use agent technologies.

2.3 The PISA Reference Case: The Job-Searching Agent

To facilitate discussions, the PISA researchers have defined a reference agent to be described and explored in detail. The specific example is a job-searching agent, which will search the Internet for jobs on behalf of its users. The agent will carry information about the user, including sensitive information such as the current employer, salary history, and salary expectations. The agent will also know the preferences and career aspirations of the user, and will use this information when traveling to different job search sites on the Internet. The agent must match the requirements and characteristics of potential new jobs with the information it knows about its user. It may even modify the description of its user to fit the requirements of the position (e.g., emphasizing managerial experience for a business position or emphasizing publications for an academic position). Moreover, it will most often do this without revealing the full details about the identity of the user. This is important when users do not want their current employer to know they are searching for a new position. Other information, such as salary expectations, will have to be used when searching for suitable positions, but not be revealed to potential employers until an appropriate time.
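The staged disclosure described above can be made concrete with a small sketch. The profile fields, stage names, and policy below are invented for illustration; they are not the PISA Demonstrator's actual design.

```python
# Hypothetical profile and disclosure policy for a job-searching agent.
PROFILE = {
    "skills": ["Java", "project management"],
    "salary_expectation": 60000,
    "current_employer": "Acme Corp",
    "name": "J. Doe",
}

# Fields the agent may reveal at each stage of an interaction. Note that
# the current employer is never disclosed at any stage.
DISCLOSURE_POLICY = {
    "searching": {"skills"},                                # match jobs pseudonymously
    "negotiating": {"skills", "salary_expectation"},        # reveal terms, not identity
    "accepting": {"skills", "salary_expectation", "name"},  # identity only at the end
}

def disclose(stage: str) -> dict:
    """Return only the profile fields permitted at the given stage."""
    allowed = DISCLOSURE_POLICY.get(stage, set())
    return {field: value for field, value in PROFILE.items() if field in allowed}

print(disclose("searching"))  # skills only; employer and name stay hidden
```

Encoding the policy as data, separate from the matching logic, makes it possible to show the user exactly what would be revealed at each stage before the agent is launched.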

The human-factors issues associated with agent technologies can be explored in this reference scenario. Some obvious example questions are:


• What interfaces are appropriate for instructing agents about the information to share, and when?

• How can the system provide reassurance that a user's instructions were followed? How can users look for errors or problems?

• What interface needs to be built so users can track the actions of their agents?

2.4 Agents and Trust

It is clear that a trusting relationship must develop between the user and the agent. Users must be confident that the agent will do what they have asked, and only what they have asked. Moreover, to be effective the agent must be trusted with sensitive information, and use it only in appropriate circumstances. Since the trust between a user and an agent is so important, it is useful to examine the nature of trust in detail.

2.4.1 What is Trust?

Most generally, trust can be defined as "a generalized expectancy… that the word, promise, oral or written statement of another individual or group can be relied upon" (Rotter, 1980, p. 1). In the context of software agents, this means that the agent can be relied upon to do what it was instructed to do. But trust is more than that; it is "the condition in which one exhibits behavior that makes one vulnerable to someone else, not under one's control" (Zand, 1972). Without the vulnerability, there is no need for the trust. In the context of software agents, it means no longer controlling the software directly, letting the process act on one's behalf, and accepting the risks that this may entail. Bickmore and Cassell (2001) go on to describe trust as "people's abstract positive expectations that they can count on [agents] to care for them and be responsive to their needs, now and in the future" (p. 397).

This concept of making oneself vulnerable in order to accomplish a goal is essential for understanding trust. Without trust virtually all of our social relationships would fail and it would become impossible to function normally. If we can't trust the oncoming driver to stay in their lane, then it would become impossible to drive. If we don't trust the shopkeeper to deliver the goods we pay for, then simple purchases would become very awkward. We make ourselves vulnerable to others every day, but we are usually comfortable in doing so because we trust that their actions will not be inappropriate or harmful. Bickmore and Cassell (2001) describe trust as a process of uncertainty reduction. By trusting others to act as we expect them to act, we can reduce the things we have to worry about.

Taking a computer science approach, Marsh (1994) has defined trust in terms of the behavior of the person doing the trusting. Thus, trust is "the behavior X exhibits if he believes that Y will behave in X's best interest and not harm X". In the context of agents, this means behaving in a way that is appropriate if the agent will always have your best interests in mind, and cause you no harm.

For our purposes, then, trust can be defined as users' thoughts, feelings, emotions, or behaviors that occur when they feel that an agent can be relied upon to act in their best interest when they give up direct control.


2.4.2 The Problem of Trusting Agents: Interactions Twice-Removed

Users may have difficulty trusting software agents because the user ends up working on a task that is twice-removed from the interface (see Figure 2.1). Consider the example of a user who is using a job-searching agent. A traditional, non-removed method of searching for a job would be to talk to employers directly, perhaps by visiting their offices. Here the job seeker is interacting directly with the potential employer to get information about the position (the top panel in Figure 2.1). A more modern method of searching for a job is to work in a computer-mediated fashion where the job seeker interacts with a computer program, perhaps a WWW browser, to view information that has been created by the employer (the middle panel in Figure 2.1). Thus, the interaction between the job seeker and the employer is once-removed. (Riegelsberger & Sasse, 2001, refer to this as a dis-embedded transaction.) With a job-searching agent, the job seeker would interact with a computer program, perhaps an agent control interface, to provide instructions to the agent. The agent, in turn, would search the Internet and gather information that has been provided by the employer. There is no direct connection between the user and the job-seeking activities (the bottom panel in Figure 2.1). Thus, the interaction between the job seeker and the potential employer is twice-removed (or dis-dis-embedded).

Figure 2.1: Explanation of twice-removed transactions.

Research has shown that developing trust during once-removed interactions can be difficult, let alone trusting in twice-removed interactions. For example, Rocco (1998) showed that interpersonal trust is reduced markedly when communication is computer-mediated. Also, a number of studies, to be summarized below, have found that it can be quite difficult to develop trust during once-removed e-commerce interactions.

There are many valid reasons why users may be hesitant to trust software agents. Cheskin (1999) argued that disclosing personal information might involve more personal risk than financial interactions because personal assets like self-respect, desirability, reputation, and self-worth can be more valuable than money. Also, since agents operate autonomously outside of the user's vision and control, things may go wrong that the user does not know about, or cannot correct.


Youll (2001) has also described the issues involved in trusting agents. First, the user must make their instructions clear to the agent. This instructing phase could fail for a number of reasons: (1) the user does not clearly define the instructions, (2) the agent does not fully understand the instructions, or (3) the user and the agent interpret identical instructions differently.

Second, if the instructions have been understood, the user must be confident that the agent will execute its instructions properly, and only perform the tasks that the user intended. Third, the user must be confident that the agent will protect information that is private or sensitive. Finally, regarding the confidentiality of the information entrusted to the agent, the user must have confidence that the agent is not attacked or compromised in some way, such as through "hacking" or "sniffing". With all of these concerns, developing a trusting relationship between users and their agents is a difficult task.

On the other hand, there are also valid reasons why users might make the choice to trust agents. Again, Youll (2001) describes the advantages that agents can bring to a task. Due to the twice-removed nature of the interactions between the end-user and the task, agents are well suited for tasks that require high degrees of privacy. An agent can establish its own identity on the network, and protect the identity of the end-user. An example of how this can be done was seen in the Lucent Personalized Web Assistant (LPWA; Gabber, et al., 1999), which acted as a proxy for users who wanted to navigate the WWW without revealing their true identities. Such services can even go so far as to establish new pseudonyms for each and every transaction, making it very difficult to establish a link back to the user.
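One common way to derive per-site pseudonyms, sketched below, is a keyed hash of a user secret and the site name. This illustrates the general technique, not the LPWA's actual algorithm; the function and alias format are invented.

```python
import hashlib
import hmac

def pseudonym(user_secret: bytes, site: str) -> str:
    """Derive a stable, site-specific alias from a user secret.

    The same user gets the same alias at a given site (so a persona can be
    maintained there), but aliases at different sites cannot be linked
    without knowing the secret.
    """
    digest = hmac.new(user_secret, site.encode(), hashlib.sha256).hexdigest()
    return "user-" + digest[:12]

secret = b"long-random-user-secret"
print(pseudonym(secret, "jobs.example.com"))   # stable alias for this site
print(pseudonym(secret, "other.example.org"))  # unlinkable alias elsewhere
```

A per-transaction variant would simply mix a fresh nonce into the hash, trading persona continuity at a site for even stronger unlinkability.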

Agents are also well suited for situations where interaction policies need to be established and followed. Since software agents are embodied in explicit computer code, it is possible to establish and follow clearly defined privacy policies, rather than relying on heuristics or emotions.

2.5 Building Successful Agents: A Summary Model

Most of the research to date on privacy and trust has been focused on (once-removed) e-commerce interactions. However, the lessons are very relevant and extendable to agent interactions, and they provide a good starting point until more research is conducted on agent technologies. An important contribution to research on e-commerce trust is a path model of e-commerce customer loyalty proposed by Lee, Kim, & Moon (2000), as shown in Figure 2.2. These authors describe how attitudes towards e-commerce will be determined by the amount of trust instilled in the user, and the amount of cost perceived by the user. Trust and cost combine together, in opposite directions, to determine the overall acceptance. In addition, Lee et al. identify a number of factors that contribute to trust, such as shared values and effective communication. They also identify factors that lead to perceived cost, such as the level of uncertainty.

An extended model of agent acceptance developed for this paper is shown in Figure 2.3. Here acceptance of the agent technology is determined by the combination of trust and perceived risk. The contributing factors identified by Lee et al. are included, along with factors identified by other researchers. This section reviews this model of agent acceptance in detail.


Figure 2.2: Lee, Kim, & Moon's model of e-commerce loyalty.

An important feature of Lee et al.'s e-commerce model, and the model of agent acceptance proposed here, is the separation of trust from perceived risk. The idea is that feelings of trust and risk can be established quite independently, and together they determine the final success of the agent technology. Trust contributes to the acceptance of the agent in a positive direction, while risk contributes in a negative direction. The effect is that the two factors interact with each other, so that agents instilling a low degree of trust may still be successful if there is also a low perceived risk. On the other hand, in very risky situations it may be that no amount of trust will offset the risk perceived by the user, and the agent will never be accepted. Rotter (1980), in his review of the social psychology of interpersonal trust, supports this idea that trust and risk are separate concepts, and both contribute to the final behavior of an individual. Grandison and Sloman (2000) also describe trust and risk as opposing forces that combine during decision making about a service or an e-commerce transaction.
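The opposing-forces idea can be illustrated with a toy calculation. The linear form and unit weights below are assumptions made purely for illustration; neither Lee et al. nor the model in Figure 2.3 specifies a formula.

```python
def acceptance(trust: float, perceived_risk: float,
               w_trust: float = 1.0, w_risk: float = 1.0) -> float:
    """Toy model: trust pushes acceptance up, perceived risk pushes it down."""
    return w_trust * trust - w_risk * perceived_risk

# Modest trust can still yield net acceptance when perceived risk is low...
print(acceptance(trust=0.3, perceived_risk=0.1) > 0)   # True
# ...while in a very risky situation even high trust may not be enough.
print(acceptance(trust=0.9, perceived_risk=1.5) > 0)   # False
```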

Another important feature of the model is that the risk being described is the risk perceived by the user. This perception may, or may not, be related to the actual risk of the technology employed in the agent system. For example, the job-seeker's personal information might be encrypted with a very strong encryption technique, but if the user believes that the information will be disclosed inappropriately, this fear contributes to the perceived risk, and works against acceptance of the agent technology.

2.5.1 Factors Contributing to Trust

As is shown in Figure 2.3, trust is a complex, multifaceted concept that is influenced by a number of factors (e.g., Grandison & Sloman, 2000). In this section a number of factors contributing to feelings of trust are described, and specific design recommendations are made for building trustworthy agents.

2.5.1.1 Ability to Trust

The first factor that contributes to the trust a user may place in an agent service is their ability to trust. A number of researchers have proposed that people have a general ability to trust that forms a kind of baseline attitude when they approach any trust situation, and some people have a higher baseline level of trust than others. For example, Marsh (1994) describes "basic trust" as a person's general propensity to trust or not trust. This basic trust is part of their personality, and is one of the factors that contribute when making decisions about trust. Similarly, Rotter (1980) showed that there is a generalized trust that is "a relatively stable personality characteristic" (p. 1). Rotter also demonstrated that high and low trusters had markedly different opinions and behaviors (e.g., high trusters were less likely to cheat or lie, and were seen as happier and more attractive).

Figure 2.3: A model of agent success.

Directly related to the issue of trust on computer networks, Cranor et al. (1999) surveyed Internet users about their attitudes towards privacy and trust. The survey respondents were then classified into groups that differed in their concerns about online privacy, following a scheme originally proposed by Westin (1991). The first group (27%) was only marginally concerned with online privacy and was quite willing to provide personal information when visiting WWW sites. This group did have some concerns, such as the desire to remove themselves from marketing mailing lists, but they were generally quite trusting. The second group (17%) was at the opposite extreme, and was labeled "privacy fundamentalists". These users were extremely concerned about privacy and were generally unwilling to provide any information to WWW sites, even when privacy protection measures were in place. The third and largest group (56%) was labeled the "pragmatic majority" because they had some concerns about privacy, but also had developed tactics for dealing with those concerns. For example, they would often look for privacy protection methods or statements when navigating the WWW.

Thus, we have abundant evidence that people differ in their basic tendency to trust. Perri 6 (2001; yes, his surname is the numeral 6) cautions, however, that basic trust can be misleading because people's perceptions are heavily modified by the context. He suggests that people's trust can change quickly depending on the context and their experience, and it is important not to overemphasize the role of general personality characteristics.


When building agent systems that users will have to trust, developers should take into account the fact that users may differ in their general ability to trust. Some users may willingly trust an agent system with little reassurance about the privacy protection measures in place, while others may be very reluctant to give their trust. This means that interfaces must be flexible and be able to provide more information and reassurance for users who require it.

2.5.1.2 Experience

The second factor that contributes to trust is experience. It is clear that users can change their willingness to trust based on their experiences (Marsh, 1994; 6, 2001). If they have been harmed in some way, for example, they may be less willing to trust in the future. This change in trust may be specific to the situation or it may be a change in their general ability to trust. Changes in trust can also come about indirectly because of the experiences or recommendations of others (Grandison & Sloman, 2000). This means that trust can be "transitive", being passed from user to user.

Designers of agent systems should ensure that users are able to have positive experiences so they can develop trust. This means providing ample information on the operation of the agent (feedback). In addition, designers should support a sharing function so users can relate their experiences and trusting attitudes can be shared and spread (assuming the experiences are positive ones). This may mean collecting testimonials or anecdotes that can be shared with other users.

2.5.1.3 Predictable Performance

Another factor that can lead to agent trust is predictable performance. Systems and interfaces that perform reliably and consistently are more likely to be trusted by users. Bickford (1997) describes three important principles for predictable performance and its role in building trust:

1. consistency: The interface and system behave the same way each time they are used. For example, certain functions are always accessed in the same way, and always lead to the expected result.

2. aesthetic integrity: The interface has a consistent look and feel throughout the entire system. This includes the page design, buttons, text styles, etc.

3. perceived stability: The system should appear stable to the user. It should not crash. There should be no changes without users' knowledge, and users must be kept informed about any operational issues, such as upgrades or downtimes.

Another aspect of predictable performance is response time. Users prefer response times that are consistent and predictable, rather than variable and unpredictable (Shneiderman, 1997).

The resulting recommendation is that developers should ensure that the interface is consistent and predictable. This may mean adopting a style guide or interface guideline that is used in all parts of the system. Developers should also ensure that the system behaves consistently, and appears to be stable. Human factors evaluation techniques that may be useful for testing these aspects of a design are reviewed in Section 4.


2.5.1.4 Comprehensive Information

Another important factor in determining users' trust of a system is the amount of information provided. Systems that provide comprehensive information about their operation are more likely to be understood, and more trusted. Norman (2001) suggests that agent systems must provide an image of their operation so that users can develop a mental model of the way the system works. It is through this model that they will develop expectations and attitudes towards the system. Norman argues that users will develop mental models and assumptions about the system even when no information is provided, and these models may be wrong. To prevent this, developers should explicitly guide the model development by showing the operation of the system.

The importance of internal models of system operation was recently demonstrated by Whitten & Tygar (1999). This study tested users' ability to use a PGP system to certify and encrypt e-mail. The results showed that the majority of the users were unable to use the system to perform the task. In fact, 25% of the users e-mailed the secret information without any protection. An analysis of the errors and an evaluation of the interface led these researchers to conclude that the major source of the problems was that users did not understand the public key model used in the PGP system. The PGP interface that was tested failed to provide comprehensive information about how public key encryption works, and the roles and uses of public and private keys. Without this information, users often developed their own ideas about how the system worked, with disastrous results.

Another example of a system that does not provide comprehensive information is the "cookies" module used in WWW browsers (Bickford, 1997). Cookies are small files that are assembled by WWW sites and stored on users' computers. Later, they can be retrieved by the WWW sites and used to identify repeat visitors, preferences, and usage patterns. The problem with cookies is that they can store a variety of information about users (including sensitive information) and yet their operation is invisible. Unless users explicitly change their browser options, they do not know when cookies are created or retrieved. In addition, most WWW browsers do not provide any way of viewing the cookies. They omit such simple functions as listing what cookies are stored on a system, and an ability to view the information stored within them. The P3P initiative (Reagle & Cranor, 1999) is an attempt to give users more control over cookies.
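The simple cookie-listing function that browsers omit is easy to build wherever the cookie store is accessible. The sketch below uses Python's standard http.cookiejar to hold and list two made-up cookies; a real transparency tool would read the browser's own store instead.

```python
from http.cookiejar import Cookie, CookieJar

def make_cookie(name: str, value: str, domain: str) -> Cookie:
    # Minimal Cookie construction; most positional fields are defaults
    # chosen only to make the example self-contained.
    return Cookie(0, name, value, None, False, domain, True, False,
                  "/", True, False, None, False, None, None, {})

jar = CookieJar()
jar.set_cookie(make_cookie("session_id", "abc123", "shop.example.com"))
jar.set_cookie(make_cookie("prefs", "lang=en", "news.example.org"))

# The transparency function the text calls for: list every stored cookie.
for cookie in jar:
    print(f"{cookie.domain}\t{cookie.name} = {cookie.value}")
```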

Developers of agent technologies must provide comprehensive information about how the system works. The role of the agent must be explained, and its operation must be obvious. This may mean allowing users to observe and track the actions performed by an agent, both in real-time and after the fact. In addition, effective interfaces should be developed for viewing and altering the information stored by agents.

2.5.1.5 Shared Values

Another factor that can lead to users trusting agents is the establishment of shared values between the user and the agent. That is, to the extent that the user feels that the agent values the things that they would, they will have more trust in the agent. In interpersonal relationships, these shared values are often built through informal interactions, such as the small talk that occurs in hallways or during coffee breaks. Bickmore and Cassell (2001) tested the role of small talk (informal social conversation) in building trustworthy agents. These researchers included small talk capabilities in a real estate purchasing agent called REA. REA was a life-sized conversational agent embodied as an animated figure on a large computer screen. REA was able to engage in small talk conversations designed to increase feelings of closeness and familiarity. For example, REA conversed about the weather, shared experiences, and her laboratory surroundings. In one experiment, a condition that included small talk interactions was compared with another condition that only involved task-oriented interactions. The task in the experiment was to determine the users' housing needs, and this included gathering personal information about how much the user could afford to spend, and how large a house was required. When measures of trust and willingness to share personal information were examined, the results showed that the condition that involved informal social dialogues led to higher levels of trust among extroverted users (it is not clear why this effect was not found for introverted users).

Values between agents and their users can also be shared explicitly. For example, privacy policies can be clearly articulated so that users can compare their concerns with the policies in place (Cheskin, 1999).

2.5.1.6 Communication

Another factor that determines the amount of trust is the amount and effectiveness of communication between the agent and the user. Norman (1997) argues that continual feedback from the agent is important for success. This feedback should include having the agent repeat back its instructions so it is clear what the agent understood. Also, error messages should be constructed so that it is clear what was understood, and what needs to be clarified. In addition, through communication it should be made clear what the capabilities and limits of the agent are.
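The "repeat back" feedback Norman recommends can be as simple as rendering the agent's parsed understanding before it acts. The instruction format below is invented for illustration.

```python
def confirm(instruction: dict) -> str:
    """Echo the agent's understanding of an instruction back to the user."""
    return (f"I understood: search for '{instruction['role']}' positions "
            f"paying at least {instruction['min_salary']}, without revealing "
            f"your name. Proceed? (yes/no)")

print(confirm({"role": "software engineer", "min_salary": 55000}))
```

Surfacing the parsed instruction at this point catches all three of Youll's instructing-phase failures before the agent is launched, when a misunderstanding is still cheap to correct.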

2.5.1.7 Interface Design

The final factor that can contribute to trust of an agent is the design of the interface itself. This means the look and feel of the software that is used to control the agent. This area includes such factors as appearance, functionality, and operation. Many of the generic attributes of good interface design also apply to designing agent interfaces. So, Norman's (1990) recommendations about "visible affordances" are relevant here, which means that whenever possible the function of an interface component should be clear from its visible appearance.

Table 2.1: Interface Characteristics That Build Trust

1. brand: Trust can be influenced by the extent that users are already aware of the service provider, and their feelings about that provider. Providers that are already trusted in other contexts, such as real-world stores, may also be trusted in the new context.

2. navigation: Trust is influenced by the ease of finding things, which results from clear, logical presentation and consistent design.

3. fulfillment: Trust is enhanced if the process for getting a task done is clear and traceable.

4. presentation: Feelings of trust can be increased if material is presented clearly, if the layout is clean and functional, and the presentation is professional. There are reasons why banks appear as they do -- to instill trust. Broken windows and peeling paint do not instill trust. The appearance should be professional and official looking, like money or certificates.

5. technology: Trust can be built if the site appears to work smoothly and quickly.

6. logos of assurance: Including icons and text that represent seals of approval or assurances of safety can increase feelings of trust.


Cheskin (1999) completed an examination of interface designs that can communicate feelings of trust. They did this in the context of e-commerce WWW sites, but the lessons can also be applied to agent systems. In this study users were invited to comment on different e-commerce WWW sites that differed in interface design. The results were summarized in six fundamental interface characteristics that communicate trust, which are listed in Table 2.1 (many of these points reinforce the factors described above).

In another empirical study, Kim & Moon (1998) examined a number of interface design factors and their effects on levels of trust in an e-banking service. The result was a series of recommendations on the visual components of the interface, listed in Table 2.2.

Table 2.2: Visual Interface Components that Build Trust

1. use "clip art" graphics, and large, 3-dimensional images
2. use cool colors
3. use pastel shades
4. use low brightness
5. use colors symmetrically

Riegelsberger & Sasse (2001) also examined the role of various interface behaviors in building trust. In this study a mockup e-commerce interface was developed to include various interface behaviors. Potential users then "walked through" the interfaces and provided comments (see Section 4 for a description of assessment methods). The result was a list of interface behaviors that could build trust, shown in Table 2.3, and these again reinforce some of the factors reviewed above.

Table 2.3: System Behaviors That Build Trust

1. status indicators: Allowing the user to see the status of the actions can increase confidence and trust.

2. displaying data already entered: Displaying the data to be used by the system before it is launched can increase trust.

3. continuous visibility: Making the operation of the system transparent is important.

4. tracking: Allowing tracking of the activities can build trust.

5. recourse: Providing a mechanism to recall or undo an action can lead to more confidence and trust in an interface.

6. trial runs: Supporting trial runs or demonstrations may be a good technique to reassure users that the system is operating as they intended.

7. fast response times: Providing fast, consistent response times can lead to positive feelings about the system being used.

A controversial issue in designing trustworthy interfaces is the value of anthropomorphism. Does creating a human-like interface, perhaps with an animated character and a conversational interface, lead to more feelings of trust? Bickmore and Cassell (2001) argue that an animated character that can engage in small talk conversations can lead to shared values and higher trust in some users. Similarly, Laurel (1997) argues that we have years of experience interacting with other people, and these skills can be transferred to interactions with computers. Moreover, agent systems are human-like because of their ability to perform autonomous actions, so an anthropomorphic interface is appropriate. However, others (Norman, 1997; Riegelsberger & Sasse, 2001; Erickson, 1997) have argued that such anthropomorphism can lead to disappointment if the interface does not live up to expectations. If the agent cannot really behave like a human, then having a human-like interface may actually diminish trust rather than build it. Erickson (1997) has relayed anecdotes in which users questioned the motivation of human-like "guides", and sometimes became quite angry when a character did not behave as expected. Thus, developers of agent systems should only consider anthropomorphic interfaces if they truly reflect the abilities and behaviors of the agent system. Since such human-like abilities are a long way off, it is probably most appropriate to avoid anthropomorphism.

2.5.2 Factors Contributing to Perceived Risk

The other side of the model of agent success is perceived risk. Other things being equal, users will be more willing to use agent systems if they perceive the risks to be lower. The amount of perceived risk is influenced by a number of factors.

2.5.2.1 Risk Perception Bias

Similar to basic trust, users may have a basic or baseline level of perceived risk. This is probably best described as a bias to perceive situations as being risky or risk free. This bias in risk perception has been described by Perri 6 (2001) as four basic approaches to risk analysis, and these are listed in Table 2.4.

Table 2.4: Approaches to Personal Risk Analysis

1. fatalism: users feel that they have no control, and risk decisions are out of their hands

2. hierarchy: users feel that risks should be contained by controls and regulation

3. individualism: users feel that risks should be taken when appropriate for the individual

4. enclave: users feel that risks are systemic and should be handled with pressure, dissent, and market systems

Agent system designers should consider these basic approaches to risk assessment. It may be useful to design system features that address each of these areas. For example, an agent system may contain information to explain how users can have control over the risks they are taking. Also, a system can include information about the controls being put in place and the regulations that are being followed. Finally, allowing users to share information and experiences, and communicate with the system developers, may lead to feelings of empowerment and fewer concerns about risk.

2.5.2.2 Uncertainty

Another method to reduce risk perception is to reduce uncertainty. The more users know about a system and how it operates, the less they worry about taking risks (assuming all that they learn is positive). This is highly related to the "comprehensive information" and "communication" factors for building trust.


2.5.2.3 Personal Details

An obvious factor in risk perception is the amount of sensitive information being provided. If more personal details are being provided to the agent, perceptions of risk are likely to increase. System developers should only ask for information that is necessary to do the job, and avoid where possible information that may be especially sensitive. Exactly what information the users consider sensitive may require some investigation. For example, Cranor, Reagle, & Ackerman (1999) found that phone numbers were more sensitive than e-mail addresses because unwanted phone calls were more intrusive than unwanted e-mail messages.

2.5.2.4 Alternatives

Another factor that can lead to feelings of risk is a lack of alternative methods to perform a task. For example, if the only method to search for a job is to use a new agent technology, users may feel they are taking more risks than in situations where there are multiple methods (i.e., non-agent WWW interfaces, phone calls, employer visits).

2.5.2.5 Specificity

Similarly, if there is a sole supplier of a service, users may feel they are at more risk of exploitation than in situations where there are multiple suppliers. In the job-searching example, this means that users may be more comfortable if there are multiple job-searching agents to choose from.

2.5.2.6 Autonomy

Perhaps the most important factor in determining users' feelings of risk towards an agent technology is the degree of autonomy granted to the agent. As discussed previously, agents can range from low-risk advice-giving systems to higher-risk, independently acting agents. Lieberman (2002) advocates developing advice agents and avoiding, for now, agents that truly act on their own. Advice systems have the advantage that they can stay in close contact with the user and receive further instructions as they operate. Further, advice agents can learn by example as they monitor what advice their users accept. In the job-searching example, it may be most appropriate for the agent to suggest possible jobs that the user should apply for, rather than completing the application autonomously.
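The advice-giving pattern described here can be summarized as a suggest-and-confirm loop: the agent proposes candidate actions, learns from which suggestions the user accepts, and never acts until the user explicitly agrees. The following sketch illustrates that idea only; all class and function names are hypothetical and are not part of the PISA system:

```python
from dataclasses import dataclass

@dataclass
class JobPosting:
    title: str
    company: str

class AdviceAgent:
    """Suggests jobs but never applies on its own: the user stays in control."""

    def __init__(self):
        self.accepted_examples = []  # the agent learns from advice the user accepts

    def suggest(self, postings, keyword):
        # Propose candidates; this is advice only, no autonomous action.
        return [p for p in postings if keyword.lower() in p.title.lower()]

    def record_decision(self, posting, accepted):
        # Learning by example: remember which advice the user accepted.
        if accepted:
            self.accepted_examples.append(posting)

def apply_for_job(posting):
    # Executed only after explicit user confirmation.
    return f"Application sent for {posting.title} at {posting.company}"

agent = AdviceAgent()
postings = [JobPosting("Software Engineer", "Acme"), JobPosting("Gardener", "GreenCo")]
for posting in agent.suggest(postings, "engineer"):
    user_accepts = True  # in a real interface, the user decides here
    agent.record_decision(posting, user_accepts)
    if user_accepts:
        print(apply_for_job(posting))
```

The key design point is that `apply_for_job` sits outside the agent: the higher-risk, irreversible step requires an explicit user decision each time.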

2.6 Checking Your Work: Human-Factors Evaluation Techniques

Most of the standard human factors evaluation techniques are appropriate when developing agent technologies. It is beyond the scope of the current paper to provide an exhaustive review, but this section does present a brief review and some notes about the particular applicability for agent design. (Interested readers should consult one of the many books available on usability testing, such as Nielsen, 1993; Mayhew, 1999; or Shneiderman, 1997.)

The first evaluation technique to consider is qualitative research. Here researchers talk to potential users about a variety of topics that are important during the design phases. These conversations may be one-on-one interviews or focus group sessions. For example, researchers might conduct a needs analysis to determine the tasks that should be performed by the agent, and how they should be accomplished. Users may also be questioned about their preferences and concerns, and this may be particularly important for discovering concerns about privacy and sensitive information.


Another technique that will be valuable during the early design stages is heuristic evaluation. Here researchers with expert knowledge examine a prototype system against a set of criteria. These criteria may come from general background knowledge about human factors, or specific recommendations such as those presented earlier in Section 3. A related technique is a cognitive walk-through, where users are brought in and asked to interact with a system under development. Here users are asked to think aloud and provide comments as they try out the system. They may be given specific tasks to perform, and questions to guide their comments. Heuristic evaluations and walk-throughs can be very powerful for determining potential problems early in the design process, before much effort is spent building a complete system.

The final evaluation technique is a formal empirical test. In these tests users interact with a complete system, and specific performance measures are recorded under controlled conditions. For example, researchers may record the number and type of errors, or the time needed to complete a task. Empirical tests can be expensive and time-consuming to conduct, so they are often reserved for the final stages of product development.

2.7 Conclusions

Intelligent, autonomous agents have the potential to facilitate complex, distributed tasks and protect users' privacy. However, building agents users will trust with personal and sensitive information is a difficult design challenge. Agent designers must pay attention to human factors issues that are known to facilitate feelings of trust. These include providing transparency of function, details of operation, feedback, and predictability. They must also consider factors that lead to feelings of risk taking. This means reducing uncertainty, collecting the minimal amount of information, and carefully considering the amount of autonomy an agent will have.


3. From Privacy Legislation to Interface Design

There is increased awareness by the general public of their right to, and the value of, their privacy. Recent surveys indicate that Internet users are very concerned about divulging personal information online, and worried that they are being tracked as they use the Internet (Kobsa, 2002). Research has indicated that users are failing to register for WWW sites because they feel that they cannot trust the Internet with personal or financial information (Saunders, 2001). In addition, information privacy is increasingly being associated with business issues such as reputation and brand value (Kenny & Borking, 2002). Moreover, governments within the European Union, Canada, Australia, and Switzerland have adopted privacy protection legislation that is enforced through independent governmental bodies with significant oversight powers. Little guidance has been provided, however, to system developers and operators on how to implement and comply with these privacy guidelines and rules, and how to soothe users' privacy concerns. The goal of this chapter is to document a process that begins with privacy legislation, works through derived privacy principles, examines the HCI requirements, and ends with specific interface design solutions. The approach taken is one of "engineering psychology" in which knowledge of the processes of the brain is used when doing system design (Wickens & Hollands, 2000).

In the sections that follow we explain how the European Privacy Directive 95/46/EC (EC, 1995) has been analyzed to produce a set of detailed privacy principles. The principles are then examined from a human factors point of view and a set of HCI requirements are developed. Figure 3.1 shows a schematic representation of the approach we have taken. The left side of the figure shows how the examination proceeds from the EU Privacy Directive through to a set of HCI requirements and categories, and this process is documented in this chapter. The right side of Figure 3.1 shows the proposed "Privacy Interface Analysis" methodology that begins with a thorough understanding and modeling of the software or service and ends with specific interface solutions. Detailed instructions and examples for the Privacy Interface Analysis are presented in Chapter 5. Overall, our intent is to introduce the core concepts of privacy protection and HCI requirements, and then illustrate a Privacy Interface Analysis that other developers can follow.

Figure 3.1: Schematic representation of our approach.


To illustrate the Privacy Interface Analysis technique, we use the example application adopted by the PISA consortium. In the PISA Demonstrator, each user has a personal agent to which he can delegate tasks such as searching for a job or making an appointment with another person or a company. The personal agent in turn creates a dedicated agent for each task it is given. For example, a Job Search Agent (JSA) might communicate with Market Advisor Agents to locate good places to look for jobs. A Job Search Agent may also interact with a Company Agent to get more information about a position. Maintaining privacy protection as the agents share information and make autonomous decisions is the challenge of the PISA project.
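The delegation structure in the PISA Demonstrator can be sketched in code. This is an illustrative outline only: the class names mirror the agent roles described above, but every method and field is hypothetical and not drawn from the actual PISA implementation:

```python
class Agent:
    """Base class: every agent acts on behalf of an owner (the Data Subject)."""
    def __init__(self, owner):
        self.owner = owner

class CompanyAgent(Agent):
    def position_details(self, position):
        # Hypothetical: answers a Job Search Agent's questions about a position.
        return {"position": position, "details": "full-time, Amsterdam"}

class JobSearchAgent(Agent):
    """Dedicated agent created by the personal agent for a single task."""
    def __init__(self, owner, criteria):
        super().__init__(owner)
        self.criteria = criteria  # only the PII needed for this one task

    def ask_company(self, company_agent, position):
        return company_agent.position_details(position)

class PersonalAgent(Agent):
    def delegate_job_search(self, criteria):
        # The personal agent spawns a task-specific agent instead of
        # exposing all of the user's data to third parties itself.
        return JobSearchAgent(self.owner, criteria)

pa = PersonalAgent("alice")
jsa = pa.delegate_job_search({"field": "software"})
info = jsa.ask_company(CompanyAgent("acme"), "Software Engineer")
print(info["details"])
```

One privacy-relevant design choice shown here is data minimization: the task agent receives only the search criteria it needs, not the owner's full profile.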

3.1 Privacy Guidelines and Legislation

Privacy can be protected in regulatory environments such as the EU and in self-regulatory environments such as the US. In a self-regulatory environment there is an implied behavioral model of privacy protection. That is, market mechanisms are most important and the behavior of the various players will be determined by their motivations to succeed in a market, such as maximizing profit and minimizing user complaints. In the regulatory environment the model is one of compliance, although market factors may also play a role. The approach taken in this paper is to focus on the EU Privacy Directive and to facilitate "usable compliance" through privacy-enhanced interface design. This does not mean that the principles and recommendations outlined here will not be valuable in a self-regulatory environment, because attention to privacy concerns and good system design will likely have rewarding results, even if they are not legally required (Kenny & Borking, 2002).

3.2 Methods for Privacy Protection

Approaches to privacy protection are generally grouped into two sets. The first set consists of promise-based approaches based on assurance. The (perceived) privacy provided by these approaches is based on the credibility of the assurance, which is generally a product of the perceived corporate reputation of the data handler or auditor. As can be seen from recent US corporate events, reputation in this sense can be highly transitory. Practical applications of this promise-based approach to privacy are COBIT-based auditing approaches and Platform for Privacy Preferences (P3P) assurance services. The essence of these approaches is that identity is revealed and personal data is transferred on the reassurance that the environment is trusted.

The second set of approaches to privacy protection is based on self-determined anonymity. Here the user's identity is protected, although most approaches eventually involve at least one party other than the user knowing the real identity. That is, in both practical and legal senses, actual anonymity is difficult or potentially impossible to achieve. Examples of this approach at the network level are mix nets, "anonymizers", and onion routing. The problem with anonymity approaches is that many advanced services, such as WWW portals, group discussions, or job-searching assistance, require personally identifiable information about the user in order to be effective. Usability, interoperability, and scalability challenges are also significant with these approaches. The result is that promise-based approaches to privacy protection are likely to dominate the privacy arena for some time to come. This means that users must rely on the assurances provided by the system operators, and any guidance regarding providing effective, believable assurance is especially important.

3.3 Related Work

Alfred Kobsa (2001, 2002) has recently conducted analyses with goals similar to the current project. Kobsa is interested in personalization services, such as WWW sites that remember your name and preferences. Such personalized services are made possible because the sites collect personal information about the users, either explicitly by asking for the information, or implicitly by tracking usage patterns. Although the personalized services can be useful and valuable, the storage and use of personal information both worries some users and falls under the auspices of privacy guidelines and legislation. Kobsa has examined the implications of the privacy laws and user concerns and developed design guidelines to help WWW site operators build privacy-sensitive systems. These guidelines include suggestions like: (1) inform users that personalization is taking place, and describe the data that is being stored and the purpose of the storage; (2) get users' consent to the personalization; and (3) protect users' data with strong security measures. The current analysis goes deeper to focus on the requirements necessary when complying with the European Privacy Directive, and includes a discussion of specific interface techniques that can be used to meet those requirements.
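Kobsa's first two guidelines (inform the user, then obtain consent) amount to a consent gate placed in front of any storage of personal data. A minimal sketch of that idea follows; the API is invented for illustration and is not drawn from Kobsa's work or from any personalization library:

```python
class ConsentError(Exception):
    """Raised when personal data would be stored without the user's consent."""

class PersonalizationStore:
    """Stores user preferences only after explicit, per-purpose consent."""

    def __init__(self):
        self._consents = {}  # (user, purpose) -> True once consent is given
        self._data = {}

    def give_consent(self, user, purpose):
        # Guidelines (1) and (2): the user is told the purpose and agrees to it.
        self._consents[(user, purpose)] = True

    def store(self, user, purpose, key, value):
        # The consent check happens before any personal data is kept.
        if not self._consents.get((user, purpose)):
            raise ConsentError(f"no consent from {user!r} for {purpose!r}")
        self._data[(user, key)] = value

store = PersonalizationStore()
store.give_consent("alice", "personalization")
store.store("alice", "personalization", "greeting_name", "Alice")
try:
    store.store("bob", "personalization", "greeting_name", "Bob")
except ConsentError as e:
    print("refused:", e)
```

Keying consent by purpose, not just by user, reflects the purpose-limitation idea developed later in this chapter: consent to one use of the data does not authorize others.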

3.4 Privacy Principles

3.4.1 EU Legislation

The right to privacy in the EU is defined as a human right under Article 8 of the 1950 European Convention on Human Rights. The key privacy document is Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data, and the free movement of such data (hereafter referred to as The Directive) (European Commission, 1995). Also, Directive 97/66/EC (European Commission, 1997) of the European Parliament and of the Council concerning the processing of personal data and the protection of privacy in the telecommunications sector applies and strengthens the original directive in the context of data traffic flow over public networks. These two directives represent the implementation of the human right to privacy within the EU.

The Directive places an obligation on member states to ratify national laws that implement the requirements of The Directive. This has resulted in, for instance, the Wet Bescherming Persoonsgegevens 1999 in The Netherlands and The Data Protection Act 1998 in the UK. The national legislatures of EU member states must implement The Directive to substantially similar degrees. Such implementation includes sanctioning national enforcement bodies, such as the Dutch Data Protection Authority, with prosecutorial powers.

The Directive defines a set of rights accruing to individuals concerning personal data (also known as Personally Identifiable Information, or PII), with some special exceptions, and lays out rules of lawful processing on the part of users of that information that are applicable irrespective of the sector of application. Specifically, The Directive specifies the data protection rights afforded to citizens or "data subjects", plus the requirements and responsibilities of "data controllers" and by association "data processors". The Directive attempts to balance the fundamental right to privacy against the legitimate interests of data controllers and processors -- a distinctive and central characteristic of the EU approach to data protection. Each of the roles is described in Table 3.1.

Since The Directive in principle prohibits the processing of EU citizens' data in nations whose privacy laws are not as strong as those in the Union, an understanding was required between the EU and the US (Reidenberg, 2001). The result is a "Safe Harbor" arrangement in which US companies voluntarily self-certify that they fulfill the privacy requirements as stated in the Safe Harbor agreement. The effect is that Safe Harbor signatories become equivalent to European processors.


Table 3.1: Privacy Roles Defined by The Directive

Data Subject: The Data Subject is a person who can be identified by reference to one or more factors specific to his or her physical, physiological, mental, economic, cultural or social identity. Even data associated to an individual in ambiguous ways may be deemed reasonably identifiable given a reasonable projection of technological development. Further, following Article 1 of the Council of Europe Convention 108/81 (Council of Europe, 1981), the fundamental right to data protection applies not because of the nationality of the data subject, but as a result of a Controller or Processor operating in the EU.

Controller: The Controller is the custodian of the data subject's data and the party who determines the purpose and means of processing personal data. The Controller is defined as the holder of ultimate accountability as it relates to the correct processing of the subject's personal data. Though an organizational entity is itself legally accountable, in reality those actually responsible for assuring correct processing are those operating at the governance level, and frequently this is a company's board of directors.

Processor: The Processor is the entity that processes personal data on behalf of the Controller where the Controller has determined that this is required. The Processor is accountable to the Controller, not to the Data Subject.

3.4.2 Overview of the Resulting Principles

As The Directive concerns itself with data processing, it must be implemented through a combination of information technology and governance initiatives. Privacy principles abstracted from the complexities of legal code have been developed to simplify this process. Table 3.2 shows a high-level summary of the privacy principles. Our research has focused on the privacy principles of (1) transparency, (2) finality and purpose limitation, (3) lawful basis, and (4) rights, because these principles have the most important implications for user interface design. The remainder of this paper will be restricted to these four privacy principles.

3.5 HCI Requirements

The principles shown in Table 3.2 have HCI implications because they describe mental processes and behaviors that the Data Subject must experience in order for a service to adhere to the principles. For example, the principles require that users understand the transparency options, are aware of when they can be used, and are able to control how their PII is handled. These requirements are related to mental processes and human behavior, and HCI techniques are available to satisfy these requirements. For example, an HCI specialist might examine methods for ensuring that users understand a concept, such as providing documentation, tutorials, and interface design characteristics.


Table 3.2: High-Level Summary of Privacy Principles

(items marked with * are analyzed in detail)

Reporting the processing -- All non-exempt processing must be reported in advance to the National Data Protection Authority.

* Transparent processing -- The Data Subject must be able to see who is processing his personal data and for what purpose. The Controller must keep track of all processing performed by it and the data Processors and make it available to the user.

* Finality & Purpose Limitation -- Personal data may only be collected for specific, explicit, legitimate purposes and not further processed in a way that is incompatible with those purposes.

* Lawful basis for data processing -- Personal data processing must be based on what is legally specified for the type of data involved, which varies depending on the type of personal data.

Data quality -- Personal data must be as correct and as accurate as possible. The Controller must allow the citizen to examine and modify all data attributable to that person.

* Rights -- The Data Subject has the right to acknowledge and to improve their data as well as the right to raise certain objections.

Data traffic outside EU -- Exchange of personal data to a country outside the EU is permitted only if that country offers adequate protection. If personal data is distributed outside the EU then the Controller ensures appropriate measures in that locality.

Processor processing -- If data processing is outsourced from Controller to Processor, controllability must be arranged.

Security -- Protection against loss and unlawful processing.

Table 3.3 presents a more detailed summary of the four privacy principles under consideration in this paper. Included in Table 3.3 are the HCI requirements that have been derived from the principles. These requirements specify the mental processes and behavior of the end user that must be supported in order to adhere to the principle. For example, the principle related to transparent processing leads to a requirement that users know who is processing their data, and for what purpose.

The HCI requirements outlined in Table 3.3 are not unrelated. The core concepts in the requirements can be grouped into four categories: (1) comprehension: to understand, or know; (2) consciousness: to be aware, or informed; (3) control: to manipulate, or be empowered; and (4) consent: to agree.


Table 3.3: Privacy Principles, HCI Requirements, and Design SolutionsPrivacy Principle HCI Requirement Possible Solution

1 Transparency: Transparency is where a Data Subject(DS) is empowered to comprehend the nature ofprocessing applied to her personal data.

users must be aware of thetransparency options, and feelempowered to comprehendand control how theirPersonally IdentifiableInformation (PII) is handled

during registration, transparency information isexplained and examples or tutorials are provided

1.1 DS informed: DS is aware of transparencyopportunities

users must be aware of thetransparency options

Opportunity to track controller's actions madeclearly visible in the interface design

1.1.1 For: PII collected from DS. Prior to PII capture: DSinformed of: controller Identity (ID) and PurposeSpecification (PS)

users know who is controllingtheir data, and for whatpurpose(s)

at registration, user is informed of identity ofcontroller, processing purpose, etc.

1.1.2 For: PII not collected from DS but from controller.DS informed by controller of: processor ID and PS. IfDS is not informed of processing, one of the followingmust be true: DS received prior processingnotification, PS is legal regulation, PS is security ofthe state, PS is prevention/detection/prosecution ofcriminal offences, PS is economic interests of thestate, PS is protection of DS or rights of other naturalpersons, PS is scientific/statistical & PII isanonymized, or PII are subject to any other lawgoverning their processing/storage

users are informed of eachprocessor who processes theirdata, and the users understandthe limits to this informing

- user agreements states that PII can be passedon to third parties- user agreement also contains information aboutusage tracking limitations- when viewing the processing logs, entries withlimited information are coded to draw attention,and users are reminded about the trackinglimitations

1.3 When PII are used for direct marketing purposes, DSreceives notification of possible objection. Thisnotification may occur every 30 days

users understand that theycan object to processing oftheir PII for direct marketing,and the limitations on thoseobjections

- during registration, users must opt-in toprocessing for direct marketing or charitablepurposes- to ensure understanding and awareness, usersare given examples and a Just-In-Time Click-Through Agreement (JITCTA) is used for finalacceptance- users are also reminded of their opt-in/outoption in a preferences interface screen

2 Finality & Purpose Limitation: the use and retentionof PII is bound to the purpose to which it wascollected from the DS.

users control the use andstorage of their PII

interface elements for making privacy decisionsare prominent and obvious

2.1 The controller has legitimate grounds for processingthe PII (see Principle 3.1)

users give implicit or explicitconsent

click-through agreement should obtainunambiguous consent for controller to processthe PII

2.2 Obligations: A controller must process according tohis PS, controller also ensures other processors presenta PS to be considered a recipient of the PII. Whenassessing a processor, the controller considers PIIsensitivity and the similarity of processor PS toagreed-upon PS and location of the processor. Theprocessor can only go beyond the agreed PS if: theprocessor's PS is state security, orprevention/detection/prosecution of criminal offences,or economic interests of the state, or protection of DS,or rights of other natural persons, orscientific/statistical analysis

users understand that their PIIcould be used for otherpurposes in special cases

- user agreements states that PII can (must) bepassed on in special cases- when viewing the processing logs, entries withlimited information are coded to draw attention,and users are reminded about the special cases

2.3 Retention: the DS is to be presented a proposedretention period (RP) prior to giving consent, exceptwhere PS is scientific/ statistical. Controller ensuresprocessor complies with RP, except where PS isscientific/statistical. When RP expires, it is preferablydeleted or made anonymous. A record should be keptof processor's and controller's past adherence to RPs.

- users are conscious of RPprior to giving consent- users are aware of whathappens to their data when theretention time expires

- When data is provided, a retention period entryfield will be highlighted- Users are informed when information isdeleted or made anonymous because of retentionperiod expiry.

3 Legitimate Processing: Legitimate Processing (LP) iswhere the PII is processed within defined boundaries.

users control the boundaries inwhich their PII is processed

interface elements for making privacy decisionsare prominent and obvious

Page 32: BUILDING A PRIVACY GARDIAN FOR THE ELECTRONIC AGE · Page 1 / 139 BUILDING A PRIVACY GARDIAN FOR THE ELECTRONIC AGE Project number IST - 2000 - 26038 Project title PISA Deliverable

D23: Agent User Interfaces and Documentation

Page 32 / 139

Privacy Principle HCI Requirement Possible Solution

3.1 Permission: To legitimately process PII, controllerensures that one or more of the following are true: theDS gives his explicit consent, the DS unambiguouslyrequests a service requiring performance of a contract,the PS is legal obligation or public administration, thevital interests of the DS are at stake. When matchingthe PS agree to by the DS and the PS of the possibleprocessor, any of the following will preventprocessing: The controller/processor's actual PSdiffers from the PS consented to by the DS, thecontroller/processor intends passing the PII to a newprocessor, the controller/processor is not located in theEU, or the processor is violating a fundamental rightto be left alone

- users give informed consent to all processing of data
- users understand when they are forming a contract for services, and the implications of that contract
- users understand the special cases when their data may be processed without a contract

- JITCTA to confirm unambiguous consent to data processing
- JITCTA to confirm the formation of a contract, and the implications/limitations of the contract
- in the tracking interface, include a reminder of special cases when data can be processed without a contract

3.2 Sensitive Data: The controller may not process any PII that is categorized as religion, philosophical beliefs, race, political opinions, health, sex life, trade union membership, or criminal convictions unless the DS has given their explicit consent or the processor is acting under a legal obligation.

when dealing with highly sensitive information (religion, race, etc.), users provide explicit, informed consent prior to processing

if sensitive information is provided by the user, use a double JITCTA to obtain unambiguous consent for its processing
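The rule in this row amounts to a simple membership check. The following sketch uses the category list from principle 3.2; the function name and the consent flags are illustrative assumptions only.

```python
# Sensitive-data categories as listed in principle 3.2.
SENSITIVE_CATEGORIES = {
    "religion", "philosophical beliefs", "race", "political opinions",
    "health", "sex life", "trade union membership", "criminal convictions",
}

def may_process(category: str, explicit_consent: bool = False,
                legal_obligation: bool = False) -> bool:
    """A controller may process sensitive PII only with the DS's explicit
    consent or when acting under a legal obligation."""
    if category not in SENSITIVE_CATEGORIES:
        return True   # this rule only restricts the sensitive categories
    return explicit_consent or legal_obligation
```

In the interface, a `False` result for a sensitive category would be the trigger for the double JITCTA described above.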

4 Rights: The DS has the right to self-determination within the boundaries and balance of The Directive.

users understand and can exercise their rights

- at registration, use a click-through agreement to ensure that users know their rights
- interface layout provides obvious tools for controlling the rights functions

4.1 Access: The DS is conscious of her rights. The DS has the right to retrieve this data on PII processing: (1) who has received it; (2) who gave them it; (3) when; (4) for what PS; and (5) if a delete or anonymize operation has been acknowledged and authenticated. Items (1), (3), and (4) should be disclosed if the proposed PS is any one of: state security, prevention/detection/prosecution of criminal offences, economic interests of the state, legal regulation, or protection of rights and freedoms (of other persons). If the DS is below the age of consent, then access requests must be made by his/her legal representative (LR). In all cases, authentication should be proportional to the PII sensitivity.

- users are conscious of their rights, which include the right to know who has received their data, from whom, when, and why, and they understand the exceptions to these rights
- users understand and can exercise their rights

- the tracking functions are displayed prominently
- the exceptions to the rights are presented in the user agreement, and reminders are provided in the tracking interface

4.2 Control: The DS may issue erase, block, rectify, or supplement commands on their PII. The DS is informed of the result of their command within 30 days. The communication is either: request accepted and executed, or request denied and an explanation. If the PII will not be editable due to the storage strategy applied, then the DS is informed and asked to consent prior to providing any PII. The controller is accountable for the correct execution of DS requests to erase, block, rectify, or supplement the PII.

- users are conscious of their rights and can exercise control over their data, which includes the ability to erase, block, rectify, or supplement the data
- users are informed when data will not be editable, and they provide consent to processing

- the tracking functions are displayed prominently
- the exceptions to the rights are presented in the user agreement, and reminders are provided in the tracking interface
- the commands to erase, block, rectify, and supplement are associated with the tracking logs and obvious to operate
- a JITCTA is used when data will not be editable
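The command handling in principle 4.2 can be sketched as follows. The function name, the response strings, and the returned deadline are illustrative assumptions; only the four commands and the 30-day rule come from the principle itself.

```python
from datetime import date, timedelta

VALID_COMMANDS = {"erase", "block", "rectify", "supplement"}

def handle_ds_command(command: str, pii_editable: bool, received: date):
    """The DS must be informed of the result within 30 days: either
    'request accepted and executed' or 'request denied' plus an explanation."""
    respond_by = received + timedelta(days=30)   # latest date for the reply
    if command not in VALID_COMMANDS:
        return ("request denied: unknown command", respond_by)
    if not pii_editable:
        return ("request denied: the storage strategy makes this PII non-editable",
                respond_by)
    return (f"request accepted and executed: {command}", respond_by)
```

The non-editable branch corresponds to the case where, per the principle, the DS should already have consented via a JITCTA before providing any PII.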

4.3 Objections: If the DS has not given direct consent to processing and the PS is public administration or Legitimate Processing, the controller determines the validity of the objection. If the PII is sensitive data and/or the PS is sensitive, then the objection is accepted and the PII is deleted. If the PS is direct marketing, then any objection is accepted and the PII is deleted.

users are empowered to object to processing for certain purposes

the tracking logs contain a prominent function to object to the processing

4.4 Derived Information: Certain PS supplied by the processor to the controller, or by the controller to the DS, could be used to gain an insight into a person's personality, e.g., services of interest to the DS. This derived information shall not be processed unless the DS is informed of the PS related to the derived information, he/she unambiguously requests a service requiring performance of a contract, and has issued explicit consent. The DS can object to the processing of the derived information at any time, and the derived information must then be deleted.

users understand and are informed that their behavior may provide some information, and they have provided consent for the processing of this information; they are also empowered to object to this processing

- the concept of derived information is explained at registration, and an example is provided
- a JITCTA is used to confirm consent to processing
- processing logs or other results of derived information are always presented with an obvious interface for objection


In the category of comprehension, the requirements can be summarized as building a system or service that will enable users to:

• comprehend how PII is handled
• know who is processing PII and for what purposes
• understand the limits of processing transparency
• understand the limitations on objecting to processing
• be truly informed when giving consent to processing
• comprehend when a contract is being formed and its implications
• understand data protection rights and limitations

In the category of consciousness, the requirements are to allow users to:

• be aware of transparency options
• be informed when PII is processed
• be aware of what happens to PII when retention periods expire
• be conscious of rights to examine and modify PII
• be aware when information may be collected automatically

In the category of control, the requirements are to allow users to:

• control how PII is handled
• be able to object to processing
• control how long PII is stored
• be able to exercise the rights to examine and correct PII

Finally, the requirements in the area of consent are to build systems that allow users to:

• give informed consent to the processing of PII
• give explicit consent for a Controller to perform the services being contracted for
• give specific, unambiguous consent to the processing of sensitive data
• give special consent when information will not be editable
• consent to the automatic collection and processing of information

This list represents the essential HCI requirements that must be met in order to build systems that provide usable compliance with the European Privacy Directive. System designers will be well served if they consider the dimensions of comprehension, consciousness, control, and consent when building privacy-enhanced systems.

3.6 Interface Methods to Meet Requirements

The field of interface design has developed a set of techniques, concepts, and heuristics that address each of the requirement areas. It is beyond the scope of this paper to provide an exhaustive review of the field of interface design, and interested readers are encouraged to examine one of the many HCI books for more information (e.g., Nielsen, 1993; Norman, 1988; Preece et al., 1994; Shneiderman, 1987; Wickens & Hollands, 2000).

3.6.1 Comprehension

The obvious method to support comprehension or understanding is training. Users can be taught concepts and ideas through classroom training, manuals, demonstrations, etc. Such methods can be very successful, but they can also be expensive, time-consuming, and inappropriate when learning computer systems that will be infrequently used. Today, much effort is devoted to supporting comprehension without resorting to formal training methods.


User documentation, especially online or embedded documentation, is often used as a replacement for training. Most computers and software come with manuals of some sort, and much is known about how to develop material that people can learn from effectively (Nielsen, 1993). Studies have shown, however, that most users do not read the documentation, and often they cannot even find the printed manuals (Comstock & Clemens, 1987). As a result, designers often resort to tutorials and help systems to support comprehension. Help systems can be designed to provide short, targeted information depending on the context, and such systems can be very powerful. It is often difficult, however, to learn an overview of all the features of a system using built-in help. Tutorials are another method of supporting learning, and they can work well if they are designed with a good understanding of the needs of the user.

There are other methods for supporting understanding that do not rely on documentation. For example, research in cognitive psychology has shown that users often develop personal "mental models" of complex systems. These models are attempts to understand something to a level where it can be used effectively, and such models can be quite effective when faced with complex systems. HCI specialists can exploit the human tendency to create models by either guiding users to develop appropriate models, or by examining the models that already exist and accounting for them. For example, people often have a mental model of a furnace thermostat that is analogous to a water faucet. That is, the more that it is "turned on", the faster the water (or heat) will flow. This model is incorrect because most furnaces can only operate at one flow rate, and the thermostat only determines the temperature where the heat flow will be shut off. It is interesting to note that this erroneous mental model has persisted for a long time, and thermostat interface designers would likely want to take it into account. Thus, a thermostat designer might add a feature to automatically return the setting to a normal room temperature some time after the thermostat was suddenly turned to an abnormally high setting.

A related interface technique is the use of metaphors. Most modern graphical computer systems are based on a desktop or office metaphor, where documents can be moved around a surface, filed in folders, or thrown in a trashcan. The graphical elements of the interface, such as document icons that look like pieces of paper and subdirectory icons that look like file folders, reinforce this metaphor. The metaphor is valuable because it provides an environment that users are familiar with, and thus they can use familiar concepts and operations when interacting with the system. The familiar metaphor decreases the need to develop new knowledge and understanding.

There are other, more subtle techniques that can facilitate comprehension. For example, the layout of items on the screen can convey some meaning or information. Items that are grouped together visually will likely be considered to be grouped together conceptually (Nielsen, 1993), and interface designers can take advantage of that. Also, items that are ordered horizontally in a display will likely be examined from left to right, at least in North American and European cultures. Interface designers can use this sequencing tendency to ensure that users follow the recommended sequence of operations.

Feedback is also very important for supporting understanding (Wickens & Hollands, 2000). Most complex systems require some experience and learning before they can be used effectively. Without feedback, users may not learn the consequences of their actions, and understanding will be slow to develop.


3.6.2 Consciousness

The requirement of consciousness refers to the user being aware of, or paying attention to, some concept or feature at the desired time. It is related to comprehension because the awareness may require some background knowledge before conscious attention is useful. Consciousness in this context can be thought of as bringing knowledge or understanding to the attention of the user so it can be used when required.

There are many interface techniques for making users aware of something. System messages or pop-up windows are an obvious technique for making the user aware of something. For important information, these windows can be constructed so the users have to acknowledge the message before they can continue using the system. A more subtle technique is to remind the user of something without interrupting their work. This is sometimes seen in "help assistants" (such as the Microsoft Office Assistant) that make suggestions while users interact with the interface. Another way to remind users is through the arrangement of the interface. For example, if a particular option is available to a user at a certain time, placing icons or messages nearby in the interface layout can ensure that users are aware of the options.

Even more subtle methods use display characteristics to draw attention. Printing text in a certain color, such as red, can draw attention. Changing the color dynamically can be more effective. Sounds are also frequently used to make users aware of some event. The human factors discipline has a long history of designing systems that make users aware of certain things at certain times (Wickens & Hollands, 2000).

3.6.3 Control

Control refers to the ability of the user to perform some behavior. Control is related to comprehension because the user must understand the task and context to behave effectively. Control is also related to consciousness because users must be aware of the need to act before they can execute the behavior. The issue of control, however, is whether the user can actually carry out the action once they know that they are supposed to do something (awareness) and understand what to do (comprehension).

An important concept for ensuring control is affordance, which means to provide naturally or inevitably. The classic example is door opener design. With some doors, users may approach the door, understand that it is a door, be conscious that they need to open the door, and still not be able to perform the action (see Figure 3.2 for an example). In contrast, a simple metal plate placed on the surface of the door tends to be a natural signal to push the door (in fact, these are often called "push plates"), whereas a metal loop placed vertically at the edge of a door tends to be a natural signal to pull the door. By using affordances, interface designers can make the door easy to control.

Another interface technique that supports appropriate actions is mapping. The idea is to map the appearance and function of the interface to the device being controlled. This might mean making a physical analogy of the real world in the interface, such as arranging light switches on a wall in the same order that the lights are arranged in the ceiling (Norman, 1988).


Figure 3.2: A door with poor affordances. The door is solid glass with a vertical handle in the middle.

(from http://www.baddesigns.com; reprinted with permission)

Many of the subtle HCI techniques that can be used to support control are related to "obviousness". To the extent that the interface can be made obvious to the user, control (and understanding) can be smooth and effective. When interfaces are not obvious, users may have serious problems using the device or system (see http://www.baddesigns.com for some amusing examples of non-obvious designs). The goal of the interface designer is to build something that is so obvious to the user that comprehension, consciousness, and control will develop with little learning and effort.

3.6.4 Consent

The final HCI requirement category is consent. Users must be able to consent or agree to terms or conditions that may be associated with a system or service. Moreover, the consent should be "informed", meaning that the users fully understand what they are agreeing to, and what implications this may have. Obviously, supporting informed consent is related to the requirements for comprehension and consciousness.

The most common method for supporting consent in computer applications is a "user agreement". When you have installed new software on your computer, or signed up for an Internet service, you have undoubtedly seen an interface screen that presents a User Agreement or Terms of Service. In order to continue, you have had to click on an "I Agree" button or an equivalent label. These interface screens are commonly called "click-through agreements" because the users must click through the screen to get to the software or service being offered (Thornburgh, 2001). (An alternative label is "click-wrap agreement", in parallel to the more traditional "shrink-wrap" agreements attached to software packaging.) These agreement screens are an attempt to provide the electronic equivalent of a signed user agreement or service contract (Slade, 1999). By clicking on the "Agree" button, the user is confirming their understanding of the agreement and indicating consent to any terms or conditions specified in the accompanying text.

The legality of these click-through screens in forming the basis of a legal agreement or contract has been established, but with some qualifications. The Cyberspace Law Committee of the American Bar Association has recently reviewed the case law and developed a set of guidelines for creating click-through agreements (Kunz, 2002). These guidelines have been summarized into six principles to be considered by system developers, and these are listed in Table 3.4 (Halket & Cosgrove, 2002; Thornburgh, 2001).

Table 3.4: Guidelines for Creating Click-Through Agreements

1. Opportunity to review terms: users must view the terms of the agreement before consenting to the agreement. A recent case involving Netscape (Thornburgh, 2001) established that it is important that there be no other method to obtain the product or service other than by clicking through the agreement.

2. Display of terms: the terms have to be displayed in a "reasonably conspicuous" (Thornburgh, 2001) manner. A recent case involving Ticketmaster (Kunz, 2002) established that simply linking to the terms at the end of a long home page was not enough.

3. Assent to terms: the language used to accept the agreement must clearly indicate that a contract is being formed.

4. Opportunity to correct errors: there should be a method for users to correct errors, such as seeking a final confirmation before proceeding, or allowing the user to back out of an agreement.

5. Ability to reject terms: the option to reject the terms of the agreement should be clear and unambiguous, and the consequences of the rejection should be stated (e.g., "if you do not agree, you will not be able to install this software").

6. Ability to print the terms: the interface should allow the user to print the terms for later reading.

Other factors that should be considered when creating click-through agreements (Slade, 1999) are to redisplay the terms and conditions at product startup (reminding), and to support the ability to review the terms at any time (e.g., in the "help" or "about" menus). In addition, developers should adapt the terms and conditions to local languages and requirements. If these principles and considerations are heeded, case law suggests that click-through agreements will likely be enforced, at least in US courts. (Some jurisdictions, such as Germany and China, are unlikely to enforce any of these agreements; Slade, 1999).

The text of many click-through agreements tends to be long and complex, often to ensure that all the points raised above are addressed. The result is that many users have difficulty reading and understanding the documents (a comprehension problem), and many users click the "Agree" button without considering the terms at all (a consciousness problem). The problems arise because people have limited cognitive capacity: we have limited attention spans, a restricted ability to process large quantities of detailed information at one time, and limited memories. Thus, using interface techniques that are sensitive to user characteristics is important. This observation may be particularly relevant if users are being asked to agree to a number of terms that will affect them substantially, such as the processing of their personal data.


Ensuring that users fully understand and unambiguously agree to the processing of their personal information is important for complying with privacy legislation and guidelines. Consider the definition of consent provided in the EU Directive 95/46/EC on privacy protection (European Commission, 1995):

'the data subject's [user's] consent' shall mean any freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed. (Article 2-h)

It is clear that a large, cumbersome, complicated User Agreement presented to the user only when they begin to use a product or service fails to live up to the requirements for "specific" and "informed" consent, and yet these types of user agreements are the majority. These issues are of particular concern in relation to explicit consent. For example, the EU Directive states that when sensitive data (e.g., race, ethnic origin, religious beliefs) are processed, the user must give "explicit consent" (Article 8-2-a) to the processing of the sensitive data. Again, a single, large, click-through User Agreement does not meet the spirit of The Directive.

The solution to this problem proposed here is a new concept of "Just-In-Time Click-Through Agreements" (JITCTAs). The main feature of a JITCTA is not to provide a large, complete list of service terms but instead to confirm the understanding or consent on an as-needed basis. These small agreements are easier for the user to read and process, and facilitate a better understanding of the decision being made in context. Also, the JITCTAs can be customized for the user depending on the features that they actually use, and the user will be able to specify what terms they agree with, and those they do not. The responses made by the user during the JITCTAs can also be recorded so there is a clear, unambiguous record of the specific agreements made with the user. In order to implement JITCTAs, the software will have to recognize when users are about to use a service or feature that requires that they understand and agree to some term or condition.
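As a rough sketch of this mechanism (the class, the prompt callback, and the log format are assumptions for illustration, not the PISA implementation), a JITCTA manager might show a short, targeted agreement only when a consent-requiring action is triggered, and record each response:

```python
from datetime import datetime, timezone

class JitctaManager:
    def __init__(self, prompt):
        self.prompt = prompt    # callable(terms) -> True if the user clicks "I Agree"
        self.log = []           # clear, unambiguous record of each decision

    def require_consent(self, action: str, terms: str) -> bool:
        """Show a small, in-context agreement and record the user's response."""
        agreed = bool(self.prompt(terms))
        self.log.append({
            "action": action,
            "agreed": agreed,
            "when": datetime.now(timezone.utc).isoformat(),
        })
        return agreed

# Stub prompt standing in for the small click-through dialog:
manager = JitctaManager(prompt=lambda terms: True)
ok = manager.require_consent(
    action="process trade union membership",
    terms="Do you consent to the processing of your trade union membership data?")
```

Any feature that touches a consent-requiring term would call `require_consent` before proceeding, so the recorded log doubles as the unambiguous record of the specific agreements made with the user.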

A sample screen capture of a JITCTA is shown in Figure 3.3. In this example a user has selected the Trade Union Membership information field in the Create Agent interface screen of the PISA interface. Since this would be considered sensitive information in the EU Privacy Directive, a JITCTA has appeared to obtain explicit, specific, timely, unambiguous consent to the processing of this data.

In summary, well-formulated click-through agreements are legally permissible in many countries, and Just-In-Time Click-Through Agreements improve on this device by supporting more appropriate decision-making and control that is sensitive to human factors constraints.

3.7 Summary and Conclusions

This chapter introduced design guidance for privacy-enhancing technologies from a human factors point of view. For the first time, this work specified what must be included in human-computer interfaces to satisfy the spirit of European privacy legislation and principles, and satisfy the privacy needs of the users ("usable compliance").

The current work has focused on European privacy legislation and, although the resulting principles, requirements, and solutions are general, one of the challenges that remains is to ensure that the knowledge is equally applicable in other legislative settings, such as Canada, and in areas operating in a self-regulatory fashion (e.g., the USA). For example, it is possible that the market forces operating in the USA will lead to privacy requirements and expectations that have not been anticipated. Even in regulated environments, the privacy legislation and guidelines will change and evolve, and thus the human interface guidelines will also have to be dynamic.

Figure 3.3: An example of a Just-In-Time Click-Through Agreement (JITCTA).

Privacy enhancing technologies are also evolving and changing, and this will have an effect on the types of solutions that are available, and also on the privacy needs and expectations of the users. For example, the P3P protocol, if implemented widely, may have a profound effect on the privacy domain by bringing privacy issues to the attention of millions of Internet users, and hopefully providing an easy-to-use privacy control interface (e.g., Cranor, Arjula, & Guduru, 2002).


4. PISA System Design Requirements

The previous chapters have described research results, guidelines, and recommendations for building trustworthy agent systems and supporting usable compliance with privacy legislation and principles. In this chapter, specific system design requirements for the PISA Demonstrator are itemized. This chapter represents the design requirements that are being used when developing the PISA Demonstrator.

4.1 Trustworthy Interface Design Requirements

This section lists the design requirements associated with building trustworthy interfaces. Included are features that should be present in the interface, supporting information that is required (i.e., help, documentation), overall system characteristics and capabilities, and system performance issues.

Table 4.1: PISA Interface Feature Requirements

1. should use principles of visible affordances
2. should display logos of reassurance (e.g., TRUSTe, Global Sign)
3. must ensure that navigation is easy, obvious, intuitive
4. must ensure that it is easy to get the tasks done (e.g., program agent, launch agent, retrieve results)
5. should provide a clean, functional layout
6. must use a consistent look and feel for all functions
7. should present a professional, conservative appearance
8. should use high-quality graphics where appropriate
9. should use cool, pastel, low-brightness colors and shades
10. should use symmetrical display design where possible
11. must display continuous status information
12. must echo information and instructions after they are entered to confirm they are correct before execution; this should show how the agent interpreted the instructions
13. should avoid anthropomorphism
14. must only collect information necessary for the task
15. must provide detailed controls for sharing of information (e.g., share education freely, do not share name without prior approval)
16. must provide detailed controls for the actions allowed (e.g., report jobs of interest, apply for jobs of interest, set up interviews)
17. must provide an accurate representation of the abilities and limitations of the agent
18. should provide a method for users to share experiences
19. must provide a method for users to examine and change all information carried by agents
20. must provide feedback that the changed information has been distributed (versioning)
21. should provide a method to recall information already shared with other agents


Table 4.2: PISA Supporting Information Requirements

1. must provide explicit privacy and security statements
2. should mention brands or known services associated with the demonstrator (e.g., Dutch Data Protection Authority, TNO, TU Delft)
3. should show dialogue with the appropriate data protection agency to confirm registration of data handling activities
4. should provide recommendations from trusted sources
5. must provide a model of the system operation
6. must describe the measures being taken to reduce risk (e.g., encryption, anonymity)
7. must describe the privacy enhancing technologies being used
8. should provide information that is customizable for different types of user (e.g., more or less technical details)

Table 4.3: PISA Overall System Characteristic Requirements

1. must provide secure methods for authentication and authorization
2. must provide secure transmission of all data (encrypted payloads)
3. must provide continuous, real-time status information to the interface
4. must support tracking of all agent actions during operation and after the fact
5. should support stop, recall, and "undo" functions where possible (e.g., retract a job application)
6. should support trial runs with "dummy" data so users can experience the agent before taking risks
7. must provide quick responses (less than 2 seconds)
8. must appear to be stable
9. must warn users of all interruptions, down-times, etc.


4.2 Usable Compliance Design Requirements

The second set of requirements comes from the analysis of privacy legislation and principles. These requirements are organized around user tasks, and include registering and agreeing to service terms, creating a task, tracking a task, modifying a task, and dealing with system messages.

Table 4.4: Requirements for Use Case: Registering & Agreeing to Service Terms

1. Users may opt in to processing for direct marketing or charitable purposes. To ensure understanding and awareness, users are given examples and a JITCTA is used for final acceptance.

2. The concept of derived information is explained at registration, and an example is provided. A Just-In-Time Click-Through Agreement (JITCTA) is used to confirm consent to processing.

3. At personal agent creation, the user is informed of the identity of the controller, the processing purpose, etc.

4. The user agreement states that Personally Identifiable Information (PII) can be passed on to third parties.

5. The user agreement contains information about usage tracking limitations6. The user agreement will inform users that they will not be informed of

processing in some special cases7. The user agreement states that PII can (must) be passed on in special cases,

such as a legal obligation or a request related to public administration (e.g.,law enforcement)
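To make the JITCTA idea concrete, here is a hedged JavaScript sketch of the decision logic implied by the table above: fresh, just-in-time consent is requested for direct-marketing or charitable processing and for processing of derived information, while other processing is covered by the registration-time agreement. The function and field names are assumptions for this example, not the PISA implementation.

```javascript
// Sketch: when must a Just-In-Time Click-Through Agreement (JITCTA)
// be presented? The rules are an illustrative reading of Table 4.4.
function jitctaRequired(event) {
  // Opt-in purposes always need fresh, explicit consent (item 1).
  if (event.purpose === 'direct-marketing' || event.purpose === 'charity')
    return true;
  // Derived information must be explained and consented to (item 2).
  if (event.derived) return true;
  // Everything else is covered by the registration-time agreement.
  return false;
}
```

A real implementation would also log the user's response, since consent must be demonstrable after the fact.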

Table 4.5: Requirements for Use Case: Creating a Job Search Task

1. The interface should present and obtain unambiguous consent to processing for the purposes stated by the Job Search Agent (JSA). The consent agreement will clearly state that the user is entering into a contract with the PISA system. This will be a JITCTA.

2. During agent creation, a retention period for the PII must be collected and used.

3. If sensitive information is provided by the user, a double JITCTA is used to obtain specific, unambiguous consent for its processing.

4. During agent creation, information is collected about whether the user is under 16 years of age. If the user is under 16, then they must provide a legal representative. This information is retained by the JSA and used to determine who can object to processing (the user or their representative).

5. When current agents are listed, an opportunity to track actions is shown.
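Requirement 4 can be illustrated with a small sketch. The following JavaScript function (field names are hypothetical, not the JSA data model) determines who holds the right to object to processing:

```javascript
// Illustrative sketch of requirement 4: who may object to processing
// depends on whether the data subject is under 16 years of age.
function objectionHolder(user) {
  if (user.age < 16) {
    if (!user.legalRepresentative)
      throw new Error('users under 16 must provide a legal representative');
    return user.legalRepresentative;
  }
  return user.name;
}
```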


Table 4.6: Requirements for Use Case: Tracking a Task

1. A real-time tracking window will be shown to provide timely information and a feeling of control

2. Log entries with limited information are color coded to draw attention, and users are reminded about the tracking limitations

3. Processing by the same agent only needs to be logged once

4. The tracking interface will remind users that they will not be informed of processing in some special cases

5. Users are informed when information is deleted or made anonymous because of retention period expiry

6. When viewing the logs, users can access detailed information about each processor

7. When the details of a log entry are viewed, the interface should indicate when full information is not being provided by a processor, and the grounds for withholding the information

8. The tracking logs contain a prominent function to object to the processing

9. Users will receive confirmation that PII has been deleted if an objection is successful

10. Processing that is the result of derived information is also presented with an obvious interface for objecting

Table 4.7: Requirements for Use Case: Modify a Task

1. The interface will prominently display controls to change, erase, etc.

2. Users are reminded of their ability to opt-in/out of processing for marketing purposes

3. Users can ask to view and modify data held by the JSA

4. The results of all change requests are shown to the user in the tracking logs

5. In the PISA case, all PII will be editable by the user

Table 4.8: Requirements for Use Case: Handling System Messages

1. If the JSA has existed for 30 days and the user has chosen to opt-in to marketing processing, the interface should confirm that this is still their choice.

2. If a processing agent fails to obey a delete instruction, this information is relayed to the user, stored, and used to block further PII transfers to that agent.
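Requirement 1 above amounts to a simple date check. A minimal sketch, assuming a `createdAt` timestamp (in milliseconds) and a `marketingOptIn` flag on the agent record (both names are illustrative):

```javascript
// Sketch of Table 4.8, requirement 1: after an agent has existed for
// 30 days with a marketing opt-in, the interface should re-confirm
// that this is still the user's choice.
const MS_PER_DAY = 24 * 60 * 60 * 1000;

function needsOptInReconfirmation(agent, now = Date.now()) {
  const ageDays = (now - agent.createdAt) / MS_PER_DAY;
  return agent.marketingOptIn && ageDays >= 30;
}
```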

The above tables represent a summary of the requirements that were derived from the analysis of the EU Privacy Directive. A more-detailed worksheet illustrating the analysis conducted for the PISA job-searching case is shown in Appendix 9.1.


5. UML Modelling & Privacy Interface Analysis

The previous chapters have described the development of system requirements based on human-factors principles and an analysis of privacy protection principles, and the resulting requirements have been listed in detail. The next step was to determine how these requirements could be met in the PISA Demonstrator. Figure 5.1 shows a simple summary of the main user tasks that must be supported in the PISA Demonstrator. Although colorful, this diagram does not express the requirements and desired features in a clear manner that implementers (programmers) can work with. A relatively recent technique useful at this stage of a project is UML modelling, where requirements and desired behaviours can be explicitly stated, visualized, and shared. This chapter describes how UML models were created from an HCI point of view, and their role in the development of the PISA Demonstrator.

Figure 5.1: Illustration of the major PISA tasks or modules.

5.1 The Privacy Interface Analysis Methodology

This section outlines how all of the knowledge of requirements and design techniques can be brought together to systematically conduct a Privacy Interface Analysis (see the right side of Figure 3.1 for a schematic summary of this analysis method).


5.1.1 Develop a Service/Application Description

The first step in the analysis is to prepare a detailed description of the operation of the program or service. A useful technique for conducting this analysis is the Unified Modeling Language (UML) (Rumbaugh, Jacobson, & Booch, 1998), which is a powerful method for specifying, visualizing, and sharing specifications and design decisions. By creating a set of interrelated diagrams or models, the developers can visualize and examine the features of the software long before any programming code is written. Although UML is not required to complete a thorough Privacy Interface Analysis, it does make the process easier and the result more valuable.

A primary UML modeling technique is Use Case modeling. Here a high-level diagram is created to show the functionality of the system from the users' point of view. The purpose of the Use Case analysis is to specify what the software will do, and not to focus on how it will do it (that will come later). Figure 5.2 shows a simple Use Case diagram for the PISA Demonstrator example. Similar to Figure 5.1, this diagram shows that the major functions provided by the software are creating an agent, tracking an agent, viewing agent results, etc. Doing a thorough analysis at this stage is important because each use case represents a function or feature that may involve an interface to privacy protection measures.

The next step is to determine how the application will work internally. UML structure diagrams are useful here to illustrate the software objects or classes that will be necessary to implement the functionality of a use case. Perhaps most useful are interaction diagrams, such as Object Sequence Diagrams. These diagrams model the relations between the software objects, and illustrate any data communication that must take place. Figure 5.3 shows a sequence diagram for the Register use case in the PISA Demonstrator example. This diagram depicts the major software components involved in supporting this function, such as the WWW interface, the WWW server, and the Personal Agent. It also shows the interactions between the user and the system, as well as the interactions between the software objects. Normally you should create at least one Object Sequence diagram for each use case that was identified earlier.

Figure 5.2: Use Case Diagram for the PISA Demonstrator.


5.1.2 Explore and Resolve the HCI Requirements

The third step involves analyzing the HCI requirements developed in Chapters 2 and 3 to determine their implications for the application being modelled. For each requirement, determine if a solution is already included in the current models of the application, or if a new solution is required. If a solution is needed, generic possible solutions to the HCI requirements are presented in the last column of Table 3.3, but each application may require a unique solution that is suitable for that particular situation. For example, privacy Principle 1.3.1 concerns processing for direct marketing purposes, and states that: "DS receives notification of possible objection". Applied to the PISA example, this means that users need to be made aware that they are able to object to the processing of personal data for direct marketing purposes (the comprehension and consciousness requirement categories). One method to satisfy this requirement would be to include an "opt-in" feature in the Create Agent use case so users can choose whether to participate in direct marketing, and to display that option in a distinctive color to draw attention to it. In addition, a "review options" function might be added to the Modify Agent use case to remind users that they can view and change their opt-in decision. Also, in the Track Agent use case, a control to change their opt-in decision could be provided.

Figure 5.3: Object sequence diagram for the Register use case.

To further illustrate this step in the analysis, consider what must happen during the Create Agent use case. A naive view might be that the user simply provides the system with personal information, and perhaps reads a user agreement. By examining the HCI requirements, this Create Agent function can be expanded to ensure usable compliance with the privacy principles. For example, Principle 2.3 states that personal information must have an associated retention period, after which the data is deleted or


rendered anonymous. To comply with this requirement, an interface feature to "specify retention period" can be added to the Create Agent use case. Other features that should be included in the Create Agent use case are:

• use a JITCTA to acknowledge rights
• use a JITCTA to acknowledge the formation of a contract and to consent to PII processing
• use a JITCTA if any sensitive information is collected
• provide an interface to "opt-in" to processing for direct marketing purposes
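The "specify retention period" feature implies that expired PII must be deleted or rendered anonymous. A minimal sketch of that rule in JavaScript, with assumed field names (the PISA Demonstrator's actual data model is not shown here):

```javascript
// Sketch of Principle 2.3: PII records carry a retention period set at
// agent creation; once it expires the value is rendered anonymous.
// Field names (collectedAt, retentionDays, value) are assumptions.
const DAY = 24 * 60 * 60 * 1000;

function applyRetention(records, now = Date.now()) {
  return records.map(r =>
    now - r.collectedAt > r.retentionDays * DAY
      ? { ...r, value: null, anonymized: true }  // render anonymous
      : r);
}
```

Note that the record itself survives with an `anonymized` marker, which supports the requirement (Table 4.6, item 5) that users be informed when information is deleted or made anonymous.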

Another example of the results of a privacy interface analysis is shown in Figure 5.3. Principle 1 states that the use and storage of PII must be transparent to the user. To meet that requirement, the interaction diagrams were examined and extra interactions were added to the Register use case so that information about the identity and purpose of the controller is conveyed to the user.

Another important HCI requirement is that users must understand their ability to track the processing of their PII, and be aware of any limitations. In the PISA example, a solution to this requirement is shown in Figure 5.4, which represents a possible Track Agent interface screen. This screen shows how a log of agent information sharing could be displayed; some log entries are highlighted to indicate that limited tracking information is available. In addition, the message at the bottom of the screen reminds users of the situations where activity may not have been logged at all. Another feature of the interface is the placement of control buttons for the objection functionality alongside the appropriate log entries. Thus, by using the interface features of highlighting, reminding, and grouping, the privacy principles can be implemented naturally and obviously.
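The display logic just described can be sketched as a small rendering function; the highlight labels and field names are illustrative only, not taken from the PISA prototype:

```javascript
// Sketch of the Track Agent display logic: entries with limited
// tracking information are colour-coded to draw attention, and an
// objection control is grouped with each processing entry.
function renderLogEntry(entry) {
  return {
    text: entry.description,
    highlight: entry.limitedInfo ? 'limited-info' : 'normal',
    showObjectControl: entry.kind === 'processing'
  };
}
```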

The result of a well-conducted privacy interface analysis is a set of design solutions that will ensure usable compliance with the privacy principles. These can be organized according to the use cases that are affected, as was done in Chapter 4.


Figure 5.4: A possible track agent interface screen illustrating HCI solutions.

5.2 The PISA Privacy Interface Analysis

Following the method of a Privacy Interface Analysis, a detailed description of the major functions or "use cases" from an HCI point of view (HCI UML) was developed for the PISA Demonstrator. This work involved creating and annotating a UML sequence diagram for each use case that was expected in the PISA application. These diagrams were initially developed by the HCI researchers at NRC, and then shared with all the PISA researchers and developers. Comments were exchanged and, as illustrated above, some changes were made to the system definition as a result. Overall, the HCI UML diagrams were quite successful in making assumptions concrete and stimulating discussions within the development team. The PISA team went on to develop more complete UML diagrams for illustrating and discussing the internal operations of the agent platform and inter-agent communications. The final set of HCI UML diagrams is shown in Appendix 9.2.


6. The PISA Interface Prototype

In order to demonstrate the HCI design concepts developed for PISA, and to kick-start the interface design portion of the project, a stand-alone interface demonstration was developed at NRC. The goal was to develop a set of WWW pages and back-end applications that would demonstrate a "look and feel" for the PISA Demonstrator before the actual agent platform was available. This prototype was designed to be both stand-alone, so it could be demonstrated and tested, and also modular and well-structured, so it could be integrated into the final PISA Demonstrator. These design goals have been met. The interface prototype can be demonstrated over the Internet, and interested people should contact Andrew Patrick ([email protected]) for the WWW address.

The interface prototype was developed using manually-edited HTML code to facilitate fine-level control of the interface appearance and behavior. In addition, using native HTML code, rather than a WWW authoring tool, allowed for detailed documentation of the interface components and design concepts. JavaScript was also used extensively to support dynamic and customizable WWW pages that reflected the individual usage of the system (e.g., whether a user had created an agent or not). Cascading Style Sheets (CSS) were also used to ensure that the entire prototype had a consistent look and behavior, which is an important design consideration when building trust.

In order to give the interface prototype realistic agent-like behaviors, browser cookies were used to keep track of the actions of individual users. This allowed the prototype to track users' privacy preferences and the actions that they took as they used the system. Also, a simulated agent platform was developed using the Perl programming language and the Common Gateway Interface (CGI) specification. This simulated platform produces random agent-like actions that appear in the interface as inter-agent communications or requests to the user for more information.
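As an illustration of the cookie-based state keeping described above, here is a hypothetical encode/decode pair for per-user state; the actual cookie name and format used in the prototype are not documented here and are assumed for this example:

```javascript
// Minimal sketch of persisting per-user prototype state in a cookie.
// The cookie name "pisaState" and the JSON payload are assumptions.
function encodeState(state) {
  return 'pisaState=' + encodeURIComponent(JSON.stringify(state));
}

function decodeState(cookieHeader) {
  const m = /(?:^|;\s*)pisaState=([^;]*)/.exec(cookieHeader);
  return m ? JSON.parse(decodeURIComponent(m[1])) : null;
}
```

In a browser, `encodeState` output would be assigned to `document.cookie` and `decodeState` would read it back, letting pages reflect whether the user had already created an agent, chosen an opt-in, and so on.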

Each of the major components in the interface prototype corresponds to one of the HCI UML use cases described earlier. A screen capture of the main navigation screen in the prototype is shown in Figure 6.1. More screen captures and detailed documentation on the prototype components, highlighting some of the most important design features, are provided in Appendix 9.3.


Figure 6.1: Screen capture of the main navigation screen in the interface prototype.

6.1 Satisfying the HCI Requirements in the Interface Prototype

The PISA interface prototype implements all of the design requirements outlined in Chapter 4. The most notable design features in the prototype are listed in Table 6.1.

6.2 Integration with the Main PISA Demonstrator

The interface concepts and page designs have been shared with the entire PISA team throughout the project. As a result, the PISA Demonstrator has an appearance and behavior that is drawn from the interface prototype. In fact, much of the HTML code used in the Demonstrator has been copied directly from the interface prototype pages.

In the next work period the PISA team will be using a central source code control system for development of the Demonstrator, which will allow different authors to work on different components of the system. This setup will mean that the interface designs can be fully integrated into the Demonstrator. Some of the features that were required to make the interface prototype stand alone, such as the use of cookies, will no longer be needed in the Demonstrator, but most of the work can be adapted and used. There will be some significant effort required in moving from a CGI server interface to the J2EE server used in the Demonstrator.


Table 6.1: Notable HCI Design Features in the Interface Prototype

1. security/trust measures are obvious (e.g., logos of assurance)

2. there is a consistent visual design, and visual metaphors of characters

3. there is a conservative appearance, with the background resembling a legal certificate

4. there is a functional layout organized around user tasks

5. the interface design provides an overview of the functions, the ability to focus on and control individual components, and detailed information on demand

6. sequencing by layout is supported in the left-to-right ordering of "create agent", "modify agent", "track agent", etc.

7. the interface is designed to support embedded help

8. where appropriate, the interface requires confirmation of actions, such as agreeing to the processing of personal information

9. users are reminded of their data protection rights, and controls for those rights are prominent and obvious (e.g., objecting to processing)

10. a double JITCTA is used for specially sensitive information (i.e., union membership)

11. pop-up windows are used to present interim results or to ask the user for clarification

12. obvious agent controls are included (start, stop, track, modify)

13. controls are provided for setting, customizing, and modifying privacy preferences and controls (e.g., retention period)

14. visual design is used to remind users of transparency limits (i.e., the agent tracking logs are color-coded when complete transparency is not available)


7. Usability Evaluation Plan

Planning has begun for usability evaluations of the PISA interface. Following the original project plan, there will be two rounds of usability evaluations. The first round, to take place early in 2003, will test the interface concepts and implementation in the interface prototype, and the second, to take place in the fall of 2003, will be an evaluation of the complete PISA Demonstrator.

The objective of the first usability evaluation is to determine if the PISA team was successful in constructing an agent technology that users (1) can use, (2) can understand, and (3) feel addresses their concerns about security and privacy. This testing will involve standard usability techniques: subjects will try the software system on a computer and answer questions about the features and performance that they experience.

This evaluation is being conducted in partnership with the Human Oriented Technology Laboratory (HOT Lab) at Carleton University. An HCI Master's degree candidate, Cassandra Holmes, will attempt to use unique remote usability testing methods to evaluate the PISA interface. A diagram depicting the setup of these tests is shown in Figure 7.1. Participants in the test will interact with a client computer, which will run a standard WWW browser. This computer will retrieve the prototype WWW pages from a WWW server. The client computer will also be connected to a data capture computer using a protocol called Virtual Network Computing (VNC), which will allow a copy of the client computer's display to be viewed and captured at the capture computer. Camtasia software will be used to record all the actions made by the participant, and microphones will be used to capture all the comments made by the experimenter and participant.

The testing will be done remotely: the participant and the experimenter will be in different physical locations, with the participant working at the client computer and the experimenter observing at the data capture computer. All interactions between the participant and the experimenter will be done using remote collaboration tools, such as Microsoft NetMeeting. One of the goals of the research project is to test for differences when the interaction is done over a voice channel versus a text-based "chat" channel.

Figure 7.1: Schematic representation of the remote usability testing setup.


8. References

6, P. (2001). Can we be persuaded to become PET-lovers? Paper presented at the OECD Forum Session on Privacy Enhancing Technologies, Paris, Oct. 8.

Bickford, P. (1997). Human interface online: A question of trust. Retrieved January 9, 2003 from: http://developer.iplanet.com/viewsource/bickford_trust.html

Bickmore, T., & Cassell, J. (2001). Relational agents: A model and implementation of building user trust. Proceedings of SIGCHI '01, March 31-April 4, Seattle, WA, USA. pp. 396-403.

Bradshaw, J.M. (1997). An introduction to software agents. In J.M. Bradshaw (Ed.), Software agents. Menlo Park, CA: AAAI Press/MIT Press.

Cheskin Research & Studio Archetype/Sapient (1999). eCommerce Trust Study. http://www.cheskin.com/think/studies/ecomtrust.html

Comstock, E.M., & Clemens, E.A. (1987). Perceptions of computer manuals: A view from the field. Proceedings of the Human Factors Society 31st Annual Meeting, 139-143.

Council of Europe (1981). Convention for the protection of individuals with regard to automatic processing of personal data. Retrieved January 9, 2003 from: http://europa.eu.int/comm/internal_market/en/dataprot/inter/con10881.htm

Cranor, L.F., Arjula, M., & Guduru, P. (2002). Use of a P3P user agent by early adopters. Proceedings of the Workshop on Privacy in the Electronic Society, Washington, D.C., November 21.

Cranor, L.F., Reagle, J., & Ackerman, M.S. (1999). Beyond concern: Understanding net users' attitudes about online privacy. AT&T Labs-Research Technical Report TR 99.4.3. http://www.research.att.com/library/trs/TRs/99/99.4/

European Commission (1995). Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data. Official Journal of the European Communities (1995), p. 31.

European Commission (1997). Directive 97/66/EC of the European Parliament and of the Council of 15 December 1997 concerning the processing of personal data and the protection of privacy in the telecommunications sector. Official Journal L 024, 30/01/1998, pp. 0001-0008.

Erickson, T. (1997). Designing agents as if people mattered. In J.M. Bradshaw (Ed.), Software agents. Menlo Park, CA: AAAI Press/MIT Press.

Gabber, E., Gibbons, P., Matias, Y., & Mayer, A. (1997). How to make personalized web browsing simple, secure, and anonymous. Proceedings of Financial Cryptography 97, February, 1997, Springer-Verlag, LNCS 1318. http://www.bell-labs.com/project/lpwa/papers.html

Grandison, T., & Sloman, M. (2000). A survey of trust in Internet applications. IEEE Communications Surveys, Fourth Quarter 2000. http://www.comsoc.org/livepubs/surveys/public/2000/dec/grandison.html

Halket, T.D., & Cosgrove, D.B. (2002). Is your online agreement in jeopardy? Retrieved January 9, 2003 from the CIO.com Web Site: http://www.cio.com/legal/edit/010402_agree.html

Kenny, S., & Borking, J. (2002). The value of privacy engineering. Journal of Information, Law and Technology (JILT). http://elj.warwick.ac.uk/jilt/02-1/kenny.html

Kim, J., & Moon, J.Y. (1998). Designing towards emotional usability in customer interfaces -- trustworthiness of cyber-banking system interfaces. Interacting with Computers, 10, 1-29.

Kobsa, A. (2001). Tailoring privacy to users' needs (Invited Keynote). In M. Bauer, P.J. Gmytrasiewicz, & J. Vassileva (Eds.), User Modeling 2001: 8th International Conference. Berlin-Heidelberg: Springer Verlag, 303-313. http://www.ics.uci.edu/~kobsa/papers/2001-UM01-kobsa.pdf

Kobsa, A. (2002). Personalized hypermedia and international privacy. Communications of the ACM, 45(5), 64-67. http://www.ics.uci.edu/~kobsa/papers/2002-CACM-kobsa.pdf


Kunz, C.L. (2002). Click-through agreements: Strategies for avoiding disputes on validity of assent. http://www.efscouncil.org/frames/Forum%20Members/Kunz_Click-thr_%20Agrmt_%20Strategies.ppt. See also C.L. Kunz, J. Debrow, M. Del Duca, and H. Thayer, "Click-Through Agreements: Strategies for Avoiding Disputes on Validity of Assent," Business Lawyer, 57, 401 (2001).

Laurel, B. (1997). Interface agents: Metaphors with character. In J.M. Bradshaw (Ed.), Software agents. Menlo Park, CA: AAAI Press/MIT Press.

Lee, J., Kim, J., & Moon, J.Y. (2000). What makes Internet users visit cyber stores again? Key design factors for customer loyalty. Proceedings of CHI 2000, The Hague, Amsterdam. pp. 305-312.

Lieberman, H. (2002). Interfaces that give and take advice. In J.M. Carroll (Ed.), Human-Computer Interaction in the New Millennium. N.Y.: ACM Press, 2002.

Marsh, S. (1994). Formalising trust as a computational concept. PhD Thesis, University of Stirling, Scotland. http://www.iit.nrc.ca/~steve/Publications.html

Mayhew, D.J. (1999). The Usability Engineering Lifecycle: A Practitioner's Handbook for User Interface Design. Morgan Kaufmann.

Negroponte, N. (1997). Agents: From direct manipulation to delegation. In J.M. Bradshaw (Ed.), Software agents. Menlo Park, CA: AAAI Press/MIT Press.

Nielsen, J. (1993). Usability Engineering. Boston, MA: Academic Press.

Norman, D.A. (1990). The Design of Everyday Things. Currency/Doubleday.

Norman, D.A. (1997). How might people interact with agents. In J.M. Bradshaw (Ed.), Software agents. Menlo Park, CA: AAAI Press/MIT Press. http://www.jnd.org/dn.mss/agents.html

Norman, D.A. (2001). How might humans interact with robots? Human robot interaction and the laws of robotology. Keynote address to a DARPA/NSF Conference on Human-Robot Interaction, San Luis Obispo, CA, Sept., 2001. http://www.jnd.org/dn.mss/Humans_and_Robots.html

Patrick, A.S. (2002). Building trustworthy software agents. IEEE Internet Computing, 6(6), 46-53.

Patrick, A.S., & Kenny, S. (2002). From privacy legislation to interface design: Implementing information privacy in human-computer interfaces. Paper submitted for publication. http://www.andrewpatrick.ca/legint/privacy-interfaces.pdf

Preece, J., Rogers, Y., Sharp, H., Benyon, D., Holland, S., & Carey, T. (1994). Human-Computer Interaction. Reading, MA: Addison-Wesley.

Reagle, J., & Cranor, L.F. (1999). The platform for privacy preferences. Communications of the ACM, 42, 48-55.

Reidenberg, J. (2001). E-commerce and trans-atlantic privacy. Houston Law Review 2001, 38.

Riegelsberger, J., & Sasse, M.A. (2001). Trustbuilders and trustbusters: The role of trust cues in interfaces to e-commerce applications. Presented at the 1st IFIP Conference on e-commerce, e-business, e-government (i3e), Zurich, Oct 3-5, 2001. http://www.cs.ucl.ac.uk/staff/jriegels/trustbuilders_and_trustbusters.htm

Rocco, E. (1998). Trust breaks down in electronic contexts but can be repaired by some initial face-to-face contact. Proceedings of CHI 98, Los Angeles, USA. pp. 496-502.

Rotter, J.B. (1980). Interpersonal trust, trustworthiness, and gullibility. American Psychologist, 35(1), 1-7.

Rumbaugh, J., Jacobson, I., & Booch, G. (1998). The Unified Modeling Language Reference Manual. Addison-Wesley.

Saunders, C. (2001). Trust central to e-commerce, online marketing. Internet Advertising Report. http://www.internetnews.com/IAR/article.php/12_926191


Shneiderman, B. (1987). Designing the User Interface: Strategies for Effective Human-Computer Interaction. Reading, MA: Addison-Wesley.

Shneiderman, B. (1997). Designing the User Interface: Strategies for Effective Human-Computer Interaction. Addison-Wesley.

Slade, K.H. (1999). Dealing with customers: Protecting their privacy and enforcing your contracts. http://www.haledorr.com/db30/cgi-bin/pubs/1999_06_CLE_Program.pdf

Thornburgh, D. (2001). Click-through contracts: How to make them stick. Internet Management Strategies. http://www.loeb.com/FSL5CS/articles/articles45.asp

Weisband, S.P., & Reinig, B.A. (1995). Managing user perceptions of email privacy. Communications of the ACM, 38(12), 40-47.

Westin, A.F. (1991) (with Louis Harris & Associates). Harris-Equifax Consumer Privacy Survey. Atlanta, GA: Equifax, Inc.

Whitten, A., & Tygar, J.D. (1999). Why Johnny can't encrypt: A usability evaluation of PGP 5.0. Proceedings of the 9th USENIX Security Symposium, August 1999. http://www.cs.cmu.edu/~alma/johnny.pdf

Wickens, C.D., & Hollands, J.G. (2000). Engineering Psychology and Human Performance (3rd Ed.). Upper Saddle River, NJ: Prentice Hall.

Youll, J. (2001). Agent-based electronic commerce: Opportunities and challenges. Position statement for panel discussion at the 5th International Symposium on Autonomous Decentralized Systems with an Emphasis on Electronic Commerce, March 26-28, 2001, Dallas, Texas, USA. http://www.media.mit.edu/~jim/projects/atomic/publications/youll-mit-isads.pdf

Zand, D.E. (1972). Trust and managerial problem solving. Administrative Science Quarterly, 17, 229-239.


9. Appendices

9.1 Detailed PISA Interface Requirements Analysis

A Detailed Privacy Interface Worksheet Completed for the PISA Demonstrator Example (see end of table for definitions)

Number Principle Principle Applied to PISA Case HCI Implications PISA Solutions1 Transparency (Directive

Reference: Articles{10a,b,c /11-1a,b,c,11-2 /13 -1a,c,d,e,f,g,13-2})

Applied to PISA: Transparency is where a Data Subject (DS) or agent representative is empowered to comprehend the nature of processing applied to her personal data.
HCI implications: users must be aware of the transparency options, and feel empowered to comprehend and control how their PII is handled.
PISA solutions: the transparency features of the software are displayed prominently in the PISA interface so that their operation is obvious.

1.1 DS is aware of transparency opportunities
Applied to PISA: The DS must be aware that he is empowered to find out exactly what has happened to his PII.
HCI implications: users must be aware of the transparency options.
PISA solutions: during agent creation, transparency features are explained. When agents are listed, the opportunity to track actions is shown. A real-time tracking window is also shown.

1.1.1 For: Personally Identifiable Information (PII) collected from the DS by the controller
Applied to PISA: Data will be collected from the DS and used, therefore...

1.1.1.1 Prior to DS PII capture: DS informed of: controller Identity (ID) / Purpose Specification (PS)
Applied to PISA: Prior to any DS PII capture, the DS is informed of: controller ID / PS / location (the last two have sensitivities appended).
HCI implications: users know who is controlling their data, and for what purpose(s).
PISA solutions: at agent creation, the user is informed of identity, controller, processing purpose, etc. This will be a JITCTA (Just-In-Time Click-Through Agreement).

1.1.1.2 Prior to DS PII capture: DS informed of: controller ID / PS {and Article 10(c) if PII sensitivity / PS sensitivity is not low}. (Article 10(c) relates to additional information to be passed to the data subject so as to guarantee fair processing when circumstances indicate additional transparency is beneficial. There is no single defined set of requirements for this additional information, because the idea is that it should be seen in the light of the specific circumstances surrounding the processing, but the data to be passed from controller to data subject should reflect the controller's commitments to responding to data subject queries (a customer charter) and reaffirmation of the access rights.)
Applied to PISA: A DS will input data into a field in the GUI. Every field is designated as having L-M-H (low / medium / high) sensitivity. Data input is then tagged with this sensitivity, which acts as persistent metadata. If any data input is not low, then A10(c) must also be offered to the DS.
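The field-level sensitivity tagging described above can be sketched as follows. This is a minimal illustration, not the PISA implementation: the field names, the sensitivity map, and the function names are assumptions introduced for the example; only the L-M-H designations, the persistent metadata, and the not-low A10(c) trigger come from the worksheet.

```python
from dataclasses import dataclass

# Illustrative field-to-sensitivity designations (hypothetical field names).
SENSITIVITY = {"name": "low", "salary_expectation": "medium", "religion": "high"}

@dataclass
class TaggedValue:
    value: str
    sensitivity: str  # persistent metadata carried with the PII

def capture(field: str, value: str) -> TaggedValue:
    """Tag user input with the sensitivity designated for its GUI field."""
    # Unknown fields default to high as a conservative assumption.
    return TaggedValue(value, SENSITIVITY.get(field, "high"))

def must_offer_a10c(items) -> bool:
    """Article 10(c) information must be offered if any input is not low."""
    return any(t.sensitivity != "low" for t in items)
```

A data-entry screen would call `capture` per field and check `must_offer_a10c` over the collected values before submission.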


1.1.2 For: PII not collected from DS but from processor
Applied to PISA: The JSA may distribute the PII to the JMAA or other agents; the JMAA then becomes the processor. Before doing this there must be a contract agreement between the parties. This is either an APS and/or a P3P policy exchange. The policy in either case says that the processor must act in accordance with controller instructions (this could usefully have non-repudiation as a property). The controller is responsible for enforcing these instructions.

1.1.2.1 DS informed by controller of: processor ID / PS
Applied to PISA: Once contracts have been exchanged, the JSA should request the processor ID, processor PS, and processor location of the processing agent. The DS is informed of this information. This data could be validated through some POK approach. If this data is either not offered or is invalidated, then the JSA should first log all interaction data regarding this agent, report the agent to the Data Protection Authority, and reject the agent as a potential processor (rule modified with PET). Processor PS and processor location have sensitivities appended.
HCI implications: users are informed of each processor who processes their data, and users understand the limits to this informing.
PISA solutions:
- the user agreement states that PII can be passed on to third parties
- when viewing the processing logs, entries with limited information are color coded to draw attention, and users are reminded about the tracking limitations
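The JSA-side acceptance check in 1.1.2.1 can be sketched as below, under stated assumptions: the dictionary keys, the injected `validate`, `log`, and `report` callables, and the function name are illustrative, not part of the PISA architecture; the log-report-reject sequence is from the worksheet.

```python
def assess_processor(processor: dict, validate, log, report) -> bool:
    """Return True if the processor may receive PII; otherwise log, report, reject."""
    required = ("id", "ps", "location")
    offered = all(processor.get(k) for k in required)
    if not offered or not validate(processor):
        # Worksheet rule: log the interaction, report the agent to the
        # Data Protection Authority, and reject it as a potential processor.
        log(f"rejecting processor: {processor.get('id', '<unknown>')}")
        report(processor)
        return False
    return True
```

The `validate` hook stands in for whatever proof-of-knowledge (POK) check the platform provides.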

1.1.2.2 Prior to processor PII capture, controller and DS also informed of A10(c) if PII sensitivity and/or PS sensitivity is not low. (We define two classes of sensitivity, "low" and "not low", and associate them with the concepts of purpose specification, location, and the personal data itself. For instance, the purpose specification of Direct Marketing is defined as a special case of purpose specification forcing the general provisions of correct processing to be applied in a more stringent way, and is therefore associated with a not-low sensitivity. A controller, or for that matter a processor, located inside the EU is associated with a low sensitivity for the concept of location. Using this terminology, the Directive explicitly defines not-low sensitivity categories of personal data.)
Applied to PISA: If the JSA holds any data that is not low sensitivity, then in addition the JSA should request A10(c) from the processor (the JMAA). This should also be requested if the PS sensitivity of the JMAA is not low.

1.1.3 If DS is not informed per 1.1.2.2, then processor acceptability is assessed on all of 1.1.3.1-6 applying
Applied to PISA: If the DS is not informed of agent processing and none of the conditions below (1.1.3.1-6) apply, then the JSA should reject the agent as a potential processor.
PISA solutions: the user agreement contains information about usage tracking limitations.

1.1.3.1 DS received prior processing notification
Applied to PISA: Has the processor been identified previously, and is this authenticated?
HCI implications: don't continually inform the user of processing if already viewed.
PISA solutions: processing by the same agent only needs to be logged once.

1.1.3.2 Processor PS is legal regulation
Applied to PISA: Is the PS the JMAA is offering legal_regulation, and is it authenticated?
HCI implications: users will be made aware that their data could be processed by agents that have legal status.
PISA solutions: the user agreement and the tracking interface will inform users that they will not be informed of processing in cases of...

1.1.3.3 Processor PS is security of the state
Applied to PISA: Is the PS security_of_state, and is it authenticated?
HCI implications and PISA solutions: as for 1.1.3.2.


1.1.3.4 Processor PS is prevention / detection / prosecution of criminal offences
Applied to PISA: Is the PS prevention_detection_prosecution_criminaloffences, and is it authenticated?
HCI implications and PISA solutions: as for 1.1.3.2.

1.1.3.5 Processor PS is economic interests of the state
Applied to PISA: Is the PS economic_interests_ofthestate, and is it authenticated?
HCI implications and PISA solutions: as for 1.1.3.2.

1.1.3.6 Protection of rights and freedoms (of other persons). (According to Article 7(f) it is possible for some general transparency conditions to be overridden by controllers acting in loco parentis, where they determine that the fundamental rights and freedoms of the data subject necessitate such actions. As this ground would in any case be interpreted in a restrictive way, and because this ground is so context specific, we are reluctant to take any pre-emptive judgment and therefore exclude this element.)
Applied to PISA: Is the PS protection of the DS or the rights of other persons?
HCI implications and PISA solutions: as for 1.1.3.2.

1.2 Inform exceptions
Applied to PISA: The two PS below, if supplied and authenticated, mean that no other information need be supplied.
PISA solutions: the user agreement and tracking interface will inform users that they will not be informed of processing in cases of...

1.2.1 DS does not need to be informed if PS is scientific / statistical; PII is then anonymised in such a way that its use is bound to this purpose. (This is an exception condition of the Directive, and is to be interpreted in a restrictive way in that only specified, established situations are permissible. An example of such a situation would be where the governmental body of the National Statistics Office authenticates itself to a controller to whom the data subject entrusted his personal data.)
Applied to PISA: If the PS supplied from the JMAA and authenticated to the JSA is scientific_statistical, then the JSA needs no other information; however, the JSA is to anonymise the PII prior to sending it to the JMAA, and log the transaction.
HCI implications: users must be made to understand that statistics processing is done anonymously.
PISA solutions: ...

1.2.2 DS does not need to be informed if the PII are subject to any other law governing their processing / storage
Applied to PISA: If the PS supplied to the JSA from the JMAA is law_regulation, then the JSA needs no other information prior to sending to the JMAA.
HCI implications: users will be made aware that their data could be processed by agents that have legal status.
PISA solutions: ...

1.3 PII super distribution
Applied to PISA: A controller (JSA) intending to pass PII to a processing agent whose PS is Direct_Marketing must apply 1.3.1-2.
HCI implications: users will have control over whether their information is passed to direct marketers.


1.3.1 DS receives objection notification possibility
Applied to PISA: The DS must be informed of absolute_objection prior to DS PII transfer, while agent processors must recognise that an absolute objection requires them both to terminate processing the PII they are about to receive, and to ensure that their recipients of the PII also comply with the termination. It is the JSA's responsibility to do this, but it could be helped by the system architecture.
HCI implications: users understand that they can object to marketing processing of their PII, and the limitations on those objections.
PISA solutions:
- during agent creation, users must opt in to processing for direct marketing or charitable purposes
- to ensure understanding and awareness, users are given examples, and a JITCTA is used for final acceptance
- users are also reminded of their opt-in/out option in the Modify Agent interface screen

1.3.2 This notification may occur every 30 days
Applied to PISA: The above objection-to-disclosure option should be made every thirty days.
HCI implications: the interface should make the opt-in options easy to find and understand.
PISA solutions:
- the opt-in instructions are displayed each time the JSA parameters are viewed or modified
- if the JSA has existed for 30 days and the user has chosen to opt in, the program should confirm this is still their choice
- this will be a JITCTA

2 Finality & Purpose Limitation (Directive reference: Articles {6-1b,e, 6-2})
Applied to PISA: Finality is where the DS's PII use and retention is bound to the purpose for which it was collected from the DS.
HCI implications: users control the use and storage of their PII.
PISA solutions: interface elements for making privacy decisions are prominent and obvious.

2.1 The controller has "permission". See Principle 3.1 for definition and conditions.
Applied to PISA: The DS must be presented with a consent decision from the JSA for processing. This consent request must contain information about processing purposes.
HCI implications: users give informed consent to all processing of data.
PISA solutions: during agent creation, the interface should present and obtain unambiguous consent to processing for the purposes stated by the JSA. This will be a JITCTA.

2.2.1 Obligations: a controller must process according to his PS; the controller must also ensure processors present a PS to be considered a recipient of the PII
Applied to PISA: The JSA must obtain information about the PS of any processing agents before PII is transferred. The JSA must compare the proposed PS with its own to ensure there is agreement.
HCI implications: users understand that their PII could be used for other purposes in special cases.
PISA solutions:
- the user agreement states that PII can (must) be passed on in special cases
- when viewing the processing logs, entries with limited information are color coded to draw attention, and users are reminded about the special cases

2.2.1.1 When assessing a processor, the controller considers PII sensitivity and the similarity of the processor PS to the controller's DS-consented PS when assessing the processor PS for the DS PII
Applied to PISA: The test for the closeness of match between the PS of the processing agent and the PS of the JSA is governed by the sensitivity of the information involved. If the PII is low-sensitivity, then the match does not have to be exact.

2.2.1.2 <2.2.1.1> variable permutation for sensitive information
Applied to PISA: If the PII sensitivity level is high, then the closeness of the JMAA PS match must simultaneously be high if the JSA is to accept the JMAA as a processor.

2.2.1.3 <2.2.1.1> variable permutation plus claimed processor location jurisdiction
Applied to PISA: If the processor location sensitivity level is not low, PII sensitivity is not low, and the PS sensitivity level is not low, then the JSA should not accept the JMAA as a processor.
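Rules 2.2.1.1-2.2.1.3 can be combined into one acceptance check, sketched below. The sensitivity labels and the mapping of "does not have to be exact" onto a medium match are assumptions for illustration; the three decision rules themselves follow the worksheet.

```python
def accept_processor(pii_sens: str, ps_sens: str, loc_sens: str,
                     ps_match: str) -> bool:
    """ps_match is the assessed closeness of the processor PS to the controller PS."""
    # 2.2.1.3: not-low location, PII, and PS sensitivities together block transfer.
    if loc_sens != "low" and pii_sens != "low" and ps_sens != "low":
        return False
    # 2.2.1.2: high-sensitivity PII requires a high-closeness PS match.
    if pii_sens == "high":
        return ps_match == "high"
    # 2.2.1.1: lower sensitivity tolerates a looser (assumed: medium) match.
    return ps_match in ("high", "medium")
```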


2.2.2 If any of 2.2.1.1-3 produce negative processor selection decisions, this is rescinded if any one of <2.2.2.1-5> holds true
Applied to PISA: If there is a poor match of PSs, transfer of PII to the processor can still take place if...
HCI implications: users will be made aware that their data could be processed by agents that have legal status.
PISA solutions: the user agreement will explain that PII can be passed to third parties under these conditions...

2.2.2.1 The processor validated PS is state security
Applied to PISA: the validated PS supplied from the JMAA is state_security.
HCI implications and PISA solutions: as for 2.2.2.

2.2.2.2 The processor validated PS is prevention / detection / prosecution of criminal offences
Applied to PISA: the validated PS supplied from the JMAA is prevention_detection_prosecution_ofcriminaloffences.
HCI implications and PISA solutions: as for 2.2.2.

2.2.2.3 The processor validated PS is economic interests of the state
Applied to PISA: the validated PS supplied from the JMAA is economic_interests_ofthestate.
HCI implications and PISA solutions: as for 2.2.2.

2.2.2.4 Protection of rights and freedoms (of other persons). (According to Article 7(f) it is possible for some general finality and purpose binding conditions to be overridden by controllers acting in loco parentis, where they determine that the fundamental rights and freedoms of the data subject necessitate such actions. As this ground would in any case be interpreted in a restrictive way, and because this ground is so context specific, we are reluctant to take any pre-emptive judgment and therefore exclude this element.)
Applied to PISA: the validated PS supplied from the JMAA is rightsandfreedoms.
HCI implications and PISA solutions: as for 2.2.2.

2.2.2.5 The processor validated PS is scientific / statistical
Applied to PISA: the validated PS supplied from the JMAA is scientific_statistical.
HCI implications and PISA solutions: as for 2.2.2.

2.3 Retention: DS presented with controller PS and a proposed retention period (RP), prior to being presented with an opt-in consent decision for having PII processed, except where PS is scientific / statistical. Controller ensures processors comply with the RP, in all cases except where PS is scientific / statistical.
Applied to PISA: An RP is associated with a PS and communicated to the DS in every instance except for the PS of scientific_statistical. The JMAA must be informed by the JSA and agree to adhere to the RP.
HCI implications: the user interface for data entry must contain a field for the retention period. Users must be shown that this retention period is being honoured (e.g., a message from the processor when data expires and is deleted).
PISA solutions: during agent creation, a retention period for the PII must be collected and used.

2.3.1 When the RP expires, the PII is preferably deleted, or made anonymous in all its instances
Applied to PISA: If the RP expires, then operations need to be performed on the PII by both the JSA and the JMAA that preferably delete, or else anonymise, the PII wherever it is held.
HCI implications: users are aware of what happens to their data when the retention time expires.
PISA solutions: in the tracking interface, users are informed when information is deleted or made anonymous because of retention period expiry.

2.3.2 A record should be kept of processor and controller past adherence to RPs
Applied to PISA: The JSA should look up the past behavior of a processing agent before transferring PII.
PISA solutions: if a processing agent fails to obey a delete instruction, this information is relayed to the user, stored, and used to block further PII transfer to that agent.
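The retention handling in 2.3-2.3.1 (each PII record carries its RP; on expiry it is deleted and the deletion is logged for the tracking interface) can be sketched as a small store. The class and method names are illustrative assumptions; only the expire-delete-log behaviour is from the worksheet.

```python
import time

class PiiStore:
    def __init__(self):
        self.records = {}   # record_id -> (value, expiry timestamp)
        self.log = []       # entries surfaced in the user's tracking interface

    def put(self, record_id, value, retention_seconds, now=None):
        """Store a PII value together with its agreed retention period."""
        now = time.time() if now is None else now
        self.records[record_id] = (value, now + retention_seconds)

    def sweep(self, now=None):
        """Delete every record whose retention period has expired, and log it."""
        now = time.time() if now is None else now
        for rid, (_, expiry) in list(self.records.items()):
            if now >= expiry:
                del self.records[rid]
                self.log.append(f"{rid}: deleted on retention expiry")
```

A real deployment would also propagate the delete to every processor holding a copy (2.3.2).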


3 Legitimate Processing (Directive reference: Articles {6-1a, 7a,b,c,d,e,f, 8-1, 8-2a,b,c,d,e, 8-3, 8-4, 8-5})
Applied to PISA: Legitimate Processing (LP) is where the DS's PII is processed within defined boundaries.

3.1 Permission: to legitimately process PII, the controller ensures that one or more of 3.1.1-6 validate; the processor must assess if 3.1.3-6 validate. The processor may not act upon this decision without non-repudiable affirmation of the controller.
Applied to PISA: The basis of processing for the PA must be determined before the PA permits itself to carry out the activities the DS wants; the PA assesses if any one of <3.1.1-6> holds...
HCI implications: users control the boundaries in which their PII is processed.
PISA solutions: the interface provides controls for giving consent, agreeing to a contract, and understanding the special cases where data can be processed without a contract.

3.1.1 The DS is presented with the controller PS and other data, such as the proposed retention period, prior to the opt-in consent decision
Applied to PISA: If the DS has given his unambiguous consent for the PA to process his PII.
HCI implications: users must unambiguously give permission for data sharing.
PISA solutions: this is important, so during agent creation a JITCTA will be used to confirm unambiguous consent.

3.1.2 The DS unambiguously requests a service requiring performance of a contract
Applied to PISA: If the DS requests a service that requires the performance of a contract, and the DS is aware of the effects, the PA can process his PII. Performance of a contract has to be deduced from the DS asking the JSA to perform a task that will require some type of contract to be initiated, most obviously a purchase of a service or product. The question here is to ensure that the DS is aware that when he does not consent to his PII being used, but he wants to use the agent to perform activities that require contract performance, then his PII will be used.
HCI implications: users give informed consent to all processing of data; users understand when they are forming a contract for services, and the implications of that contract; users understand the special cases when their data may be processed without a contract.
PISA solutions: a JITCTA will contain information that PII must be processed in order to provide the job-searching service being requested. The agreement will clearly state that the user is entering into a contract with the PISA system.

3.1.3 Controller / processor subject to a PS of legal obligation. (The Directive does not talk about the legal obligations of the processors, only of the controllers, in Article 7.)
Applied to PISA: If the JSA is subject to an access request from another agent that claims legal_obligation as its PS, then the JSA must authenticate this, and is then obliged to distribute the PII to this agent. If the JMAA is subject to a PS of legal_obligation for access to the PII, then this must be authenticated, followed by the JMAA requesting (logged) from the JSA confirmation that the JMAA can give the PII to the requestor. The JSA must send an agreement to the JMAA if the authentication checks out, and inform the DS.
HCI implications: users will be made aware that their data could be processed by agents that have legal status.
PISA solutions: the user agreement will contain information that PII must be processed if the JSA is under some legal obligation. An example might be a corporate merger that requires transfer of all customer records.

3.1.4 Vital interests of the DS at stake. (This is applicable in limited application domains which relate to strictly interpreted life-and-death situations. Within such clearly definable domains, this exception will hold to validate this form of legitimate processing.)

3.1.5 Controller / processor subject to a PS of public administration
Applied to PISA: If the JSA is subject to an access request from another agent that claims public_administrative as its PS, then the JSA must authenticate this, and is then obliged to distribute the PII to this agent. If the JMAA is subject to a PS of public_administrative for access to the PII, then this must be authenticated, followed by the JMAA requesting (logged) from the JSA confirmation that the JMAA can give the PII to the requestor. The JSA must send an agreement to the JMAA if the authentication checks out, and inform the DS.
HCI implications: users will be made aware that their data could be processed by agents that have legal status.
PISA solutions: the user agreement will contain information that PII must be processed if the JSA receives a valid request related to public administration. An example might be a law enforcement agency or court.
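The common pattern in 3.1.3 and 3.1.5 (a privileged PS claim must authenticate before the JSA releases PII, and the DS is then informed) can be sketched as follows; the function name and the injected callables are illustrative assumptions.

```python
# PS values that oblige the JSA to release PII once authenticated (from the worksheet).
OBLIGING_PS = {"legal_obligation", "public_administrative"}

def handle_access_request(claimed_ps, authenticated, release, inform_ds):
    """Release PII only for an authenticated privileged PS; always inform the DS."""
    if claimed_ps not in OBLIGING_PS:
        return False          # claim does not oblige release
    if not authenticated:
        return False          # claim could not be verified: do not release
    release()                 # the JSA is obliged to distribute the PII
    inform_ds(claimed_ps)     # transparency: the DS learns of the release
    return True
```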


3.1.6 If there is a match between the PS the DS opts in to and the PS proposed by the controller / processor, but only if all of 3.1.6.1-4 are false:
Applied to PISA: The PA can process PII if it is clear that the DS wants to use the service but has not explicitly consented to the use of his PII (similar to 4.1.2). We need to match the PS the DS is interested in with the PS of the PA; this is called the legitimate interests of the controller. Whether this is acceptable depends on all of the following being false:
HCI implications: the system must be able to compare the PS of a proposed processor with the PS the user agreed to when they created the agent. There are some things to consider when making that decision...

3.1.6.1 The controller / processor intends passing the PII to a processor
Applied to PISA: The JSA discloses he will pass on the PII to another legal entity (the JMAA).
HCI implications: as for 3.1.6.
PISA solutions: processing is done by another entity, so legitimate interests cannot be claimed and 3.1.6 is moot for the PISA Demonstrator.

3.1.6.2 The controller / processor actual PS differs from the PS consented to by the DS
Applied to PISA: The JSA will send PII to the JMAA, who has a different PS to the one the DS agreed to.
HCI implications: as for 3.1.6.

3.1.6.3 The controller / processor is not located in the EU
Applied to PISA: The PA is not located in the EU.
HCI implications: as for 3.1.6.

3.1.6.4 Fundamental right to be left alone is violated. (We determine the fundamental right to be left alone to be too complex to model, as its character is determined by the context of the situation. Predicting the situation so as to engage deterministic actions of agents carries, at this state of agent evolution, too high a degree of error.)
HCI implications: as for 3.1.6.
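The legitimate-interests test in 3.1.6 reduces to a conjunction: processing on this ground is permitted only when every one of the four disqualifying conditions (3.1.6.1-4) is false. A minimal sketch, with illustrative parameter names:

```python
def legitimate_interest_ok(passes_to_processor: bool,
                           ps_differs_from_consent: bool,
                           outside_eu: bool,
                           violates_right_to_be_left_alone: bool) -> bool:
    """3.1.6: all four disqualifying conditions must be false."""
    return not any((passes_to_processor, ps_differs_from_consent,
                    outside_eu, violates_right_to_be_left_alone))
```

As noted in 3.1.6.1, the PISA Demonstrator always passes PII to another entity, so in practice this ground is never available there.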

3.2 Prohibition: the controller may not process any PII that is categorized as religion / philosophical beliefs / race / political opinions / health / sex life / trade union membership / criminal convictions unless <3.1.3> is affirmative and the controller's ID matches the PS. (National interpretation of such special categories of data results in no common EU position on categories of processing sensitive data, although Article 8 does not give much margin of appreciation to the member state. The default solution for this EU-wide advice is therefore to defer to the highest interpretation. Matching a PS with a controller's ID does this in accordance with a strong Article 8-2(a) interpretation.)
Applied to PISA: The PA may not accept any data that belongs to the fields religion / philosophical beliefs / race / political opinions / health / sex life / trade union membership / criminal convictions unless a legitimate_grounds PS has been supplied by the PA and validated by the DS himself through an 'authentication view', and the DS gives unambiguous consent.
HCI implications: when dealing with highly sensitive information (religion, race, etc.), users provide explicit, informed consent prior to processing.
PISA solutions: if sensitive information is provided by the user, use a double JITCTA to obtain unambiguous consent for its processing.
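The special-category gate in 3.2 can be sketched as below. The field identifiers and function name are illustrative assumptions; the category list, the legitimate_grounds validation, and the double-JITCTA consent requirement come from the worksheet.

```python
# Article 8 special categories as enumerated in the worksheet.
SPECIAL_CATEGORIES = {"religion", "philosophical_beliefs", "race",
                      "political_opinions", "health", "sex_life",
                      "trade_union_membership", "criminal_convictions"}

def may_accept(field: str, legitimate_grounds_validated: bool,
               double_jitcta_consent: bool) -> bool:
    """Refuse special-category PII unless grounds are validated and consent is doubled."""
    if field not in SPECIAL_CATEGORIES:
        return True
    return legitimate_grounds_validated and double_jitcta_consent
```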

4 Rights (Directive reference: Articles {12a,b,c, 14a,b})
Applied to PISA: The Data Subject has the right to self-determination within the boundaries and balance of the Directive.


4.1 DS is conscious of her rights
Applied to PISA: The DS is conscious of her rights.
HCI implications: users are conscious of their rights, which include the right to know who has received their data, from whom, when, and why, and they understand the exceptions to these rights.
PISA solutions: during agent creation, PISA will use JITCTAs to ensure that users know their rights. Also, the functions to control how their information is processed are displayed prominently in the interface.

4.1.1 The DS has the right to retrieve this data on PII processing: (1) who has received it; (2) who gave it to them (only if available); (3) when; (4) for what PS; and (5) whether a delete / anonymise operation has been acknowledged and authenticated.
Applied to PISA: The system will have to track all processing of PII and make the logs available to the user. The logs must contain, for each processor:
- identity
- source of PII
- when
- processing purpose
- whether data has been deleted or anonymised
HCI implications: users understand and can exercise their rights to erase, block, rectify, or supplement their PII.
PISA solutions: the user can ask to view and modify data held by the JSA; when viewing the logs, users can access detailed information about each processor.
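One natural shape for the 4.1.1 tracking log is a record per processor holding the five items the DS may retrieve. The class and field names below are illustrative assumptions; the five fields map one-to-one onto the worksheet's list.

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class LogEntry:
    processor_id: str          # (1) who received the PII
    source: Optional[str]      # (2) who gave it to them, only if available
    when: str                  # (3) timestamp of the processing
    purpose: str               # (4) purpose specification (PS)
    delete_acknowledged: bool  # (5) delete/anonymise acknowledged and authenticated

def user_view(log):
    """Render log entries as plain dicts for the tracking interface."""
    return [asdict(entry) for entry in log]
```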

4.1.2 If the proposed PS is any one of 4.1.2.1-5, then (1), (3), and (4) from above can still be communicated to the DS
Applied to PISA: If the JMAA or JSA is presented with a PS of 4.1.2.1-5, then the full disclosure to the repository of attributes for the report does not have to take place. It must be assessed whether these PS are valid (by POK, for instance); if they are not, inform the Data_Protection_Authority by email of the violation identified, along with information about that agent gathered during the interaction.
HCI implications: users will be made aware that their data could be processed by agents that have legal status. When agent activity is tracked, the status of these special processors will have to be evident.
PISA solutions: the tracking interface should indicate when full information is not being provided by a processor, and the grounds for the withholding of information...

4.1.2.1 PS is state security
Applied to PISA: the PS is state_security.
HCI implications and PISA solutions: as for 4.1.2.

4.1.2.2 PS is prevention / detection / prosecution of criminal offences
Applied to PISA: the PS is prevention_detection_prosecution_criminaloffences.
HCI implications and PISA solutions: as for 4.1.2.

4.1.2.3 PS is economic interests of the state
Applied to PISA: the PS is state_economicinterests.
HCI implications and PISA solutions: as for 4.1.2.

4.1.2.4 PS is legal regulation
Applied to PISA: the PS is legal_regulation.
HCI implications and PISA solutions: as for 4.1.2.

4.1.2.5 Protection of rights and freedoms (of other persons). (According to Article 7(f) it is possible for some general finality and purpose binding conditions to be overridden by controllers acting in loco parentis, where they determine that the fundamental rights and freedoms of the data subject necessitate such actions. As this ground would in any case be interpreted in a restrictive way, and because this ground is so context specific, we are reluctant to take any pre-emptive judgment and therefore exclude this element.)
Applied to PISA: the PS is protection of rights and freedoms.
HCI implications and PISA solutions: as for 4.1.2.


4.1.3 If the DS is below the age of consent, then access is requested by her legal representative (LR); in both cases, authentication proportional to PII data sensitivity
Applied to PISA: The DS should identify her age group as either below 16, or 16 or above. If the DS is less than 16, then she is prompted to ask a legal_representative of age 16 or above to submit the access request.
HCI implications: the interface must track whether the user is a minor.
PISA solutions: during agent creation, information is collected on whether the user is under 16. If the user is below 16, then they must provide a legal representative. This information is retained by the JSA and used to determine who can object to processing (the user or their representative).
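The age gate in 4.1.3 can be sketched as a small rule determining who may lodge requests on the PII. The function name and error handling are illustrative assumptions; the age-16 threshold and the legal-representative requirement come from the worksheet.

```python
AGE_OF_CONSENT = 16  # threshold used in the PISA worksheet

def objection_party(user_age: int, legal_representative=None):
    """Return who may object to processing: the data subject, or their LR."""
    if user_age >= AGE_OF_CONSENT:
        return "data_subject"
    if legal_representative is None:
        # Worksheet rule: users under 16 must provide a legal representative.
        raise ValueError("users under 16 must provide a legal representative")
    return legal_representative
```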

4.2 Inform: post DS / LR authentication, either may lodge erase, block, rectify, or supplement commands on the DS PII
Applied to PISA: The DS is presented with the options: erase; block; rectify; supplement. These functions could be implemented by querying each processor or by accessing a central repository of information (PII).
HCI implications: users are conscious of their rights and can exercise control over their data, which includes the right to know who has received their data, from whom, when, and why, and they understand the exceptions to these rights.
PISA solutions: the interface will prominently display the controls to change, erase, etc.

4.2.1 The result of 4.2 is communicated to the DS / LR within 30 days. The communication is either: request accepted and executed; or request denied, with an explanation.
Applied to PISA: The DS is informed within 30 days of the outcome of the four options above; she will be informed either with "request accepted and executed" or with "request denied" and a report.
PISA solutions: the results of all change requests are shown to the user in the tracking logs.

4.2.2 If the PII will be uneditable due to the storage strategy applied, then the DS is informed and asked to consent prior to entering any PII
Applied to PISA: The DS must be informed by the PA if her PII will be uneditable due to the storage approach applied, prior to her PII being transferred; so the PA should 'know' its permissions management.
HCI implications: users are informed when data will be uneditable, and they provide consent to processing.
PISA solutions: in the PISA case, all PII will be editable by the user.

4.2.3 The controller is accountable for the correct execution of DS requests for erase, block, rectify, or supplement operations
Applied to PISA: The JSA must communicate to the DS, and request from the JMAA (and other recipient agents who have received the PII), whether the requests for erase, block, rectify, or supplement have been met. A logical proof that recipient agents have received and understood the JSA commands is perhaps possible.
HCI implications: users must receive confirmation that their modify requests have been carried out.
PISA solutions: the JSA will confirm that all change requests are executed, and inform the user upon completion.
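The command propagation in 4.2-4.2.3 (forward the DS request to every recipient of the PII, then report either acceptance or denial, per 4.2.1) can be sketched as below; the function signature and the response wording are illustrative assumptions.

```python
def execute_ds_request(command: str, recipients: dict) -> str:
    """Forward a DS command to every PII recipient; report the 4.2.1 outcome.

    recipients maps an agent id to a callable(command) returning True on
    confirmed execution by that agent.
    """
    assert command in ("erase", "block", "rectify", "supplement")
    results = {agent: apply(command) for agent, apply in recipients.items()}
    if all(results.values()):
        return "request accepted and executed"
    failed = sorted(agent for agent, ok in results.items() if not ok)
    return "request denied: no confirmation from " + ", ".join(failed)
```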

4.3 Objections: the DS can make either a relative or an absolute objection
Applied to PISA: The DS can object to JSA processing; the objection is determined to be one of two types, relative_objection or absolute_objection, visualised for the user.
HCI implications: users are empowered to object to processing for certain purposes.
PISA solutions: the tracking logs contain a prominent function to object to the processing.

4.3.1 A relative objection is offered if the DS has not given her consent, and the PS is not Direct Marketing but is public administrative, and/or 3.1.6 applies
Applied to PISA: Users must be able to object to processing where the PS was public administration.
HCI implications: an interface for objecting to specific agents, either before agent launch or after the fact based on tracking data.
PISA solutions: when viewing the tracking logs, users are given the option of launching a relative objection when the PS was public administration.

4.3.1.1 The controller determines the validity of the objection. If PII sensitivity and/or PS sensitivity is not low, then the relative objection is upheld. If not, the relative objection is falsified and no further action is taken.
Applied to PISA: The PA / JSA determines whether PII sensitivity and/or PS sensitivity is not low; if so, the relative_objection is upheld. If not, the relative_objection is falsified and no further action is taken.
HCI implications: users must be informed if their objections to processing have been rejected.
PISA solutions: when handling a relative objection, the system determines whether the PII is low sensitivity, and if it is, the objection can be refused.
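The objection rules in 4.3-4.3.2 (an absolute objection against direct marketing always forces deletion; a relative objection is upheld only when the PII or PS sensitivity is not low) can be combined into one sketch. The function name and return strings are illustrative assumptions.

```python
def handle_objection(kind: str, ps: str, pii_sens: str, ps_sens: str) -> str:
    """Decide a DS objection per worksheet rules 4.3.1.1 and 4.3.2."""
    if kind == "absolute":
        # 4.3.2: absolute objections apply to the direct marketing PS.
        if ps != "direct_marketing":
            return "invalid"
        return "upheld: delete PII in all instances"
    # 4.3.1.1: relative objection upheld when PII or PS sensitivity is not low.
    if pii_sens != "low" or ps_sens != "low":
        return "upheld: delete PII in all instances"
    return "refused: PII and PS are low sensitivity"
```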


4.3.1.2 If the relative objection is valid, the PII is deleted in all its instances. (4.3.1.2 is rescinded by national variations of certain PS in the public sector.)
Applied to PISA: If the PA / JSA determines the relative_objection is upheld, then they both terminate their processing of the PII by deleting it, and ensure that all other recipients do the same. Again, the system must be transparent to show every agent who has received the PII. Enforcing the deletion operation is very important.
PISA solutions: if a relative objection is upheld, then the PII must be deleted in all the agents involved in the transaction. Users will be able to confirm the deletion in the tracking logs.

4.3.2 Principle: An absolute objection interface is offered if the PS is Direct Marketing; if validated, the PII is deleted in all its instances.

Applied to PISA case: An absolute_objection interface is made available to the DS if the PA / JSA PS is direct_marketing.

HCI implications: Users can control processing for commercial or charitable purposes.

PISA solutions: When viewing the tracking logs, users are given the option of launching an absolute objection when the PS was direct marketing.

4.3.2.1 Principle: The validity of the objection is determined by assessing the authentication of the PS Direct Marketing.

Applied to PISA case: The PA / JSA accepts that if an absolute_objection is made by the DS through authentication of the PS direct_marketing, then he must terminate processing himself; it is also his responsibility to ensure that all recipients of the PII, such as the JMAA, also delete the PII. Clearly, the system must be transparent, to show every agent who has received the PII; how to enforce the deletion operation is very important.

PISA solutions: The JSA will examine the true purpose of the processor, not just the stated purpose, to determine if it is direct marketing.

4.4 Principle: Agent autonomy and special PS. Certain PS supplied by processor to controller, or by controller to DS, give an insight into a person's personality (for instance, Direct Marketing). Where this is so, all of the following must be affirmative or processing is prohibited:

Applied to PISA case: There are certain PS that give an insight into a person's personality, for instance Direct_Marketing; others may be definable. Where these PS are specified, all of the following must be assured:

HCI implications: Users understand and are informed that their behavior may provide some information, and they have provided consent for the processing of this information. They are also empowered to object to this processing.

PISA solutions:
- The concept of derived information is explained at registration, and an example is provided.
- A JITCTA is used to confirm consent to processing.
- Processing logs or other results of derived information are always presented with an obvious interface for objecting.

4.4.1 Principle: The DS unambiguously requests a service requiring performance of a contract, and has issued explicit consent.

Applied to PISA case: The DS has consented to enter a contract; it is clear that for all intents and purposes the DS wants to use this service.

4.4.2 Principle: The DS can put forward an absolute objection at any time.

Applied to PISA case: The DS is clearly aware she may put forward an absolute objection at any time.

4.4.3 Principle: The DS is informed of the PS in advance.

Applied to PISA case: A PS of Legal_Regulation is also supplied and authenticated by an actor; if to the JMAA, he must get JSA consent.
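Taken together, principle 4.4 makes conditions 4.4.1 to 4.4.3 a conjunctive gate: if any one fails, processing of a personality-revealing PS is prohibited. A minimal JavaScript sketch with illustrative field names (none of these names appear in the PISA code):

```javascript
// Principle 4.4 (sketch): for personality-revealing PS such as Direct
// Marketing, processing is permitted only if every condition holds.
function processingPermitted(ds) {
  return ds.requestedServiceWithExplicitConsent && // 4.4.1
         ds.canObjectAbsolutelyAtAnyTime &&        // 4.4.2
         ds.informedOfPurposeInAdvance;            // 4.4.3
}
```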

Definitions
DS (data subject): the applicant or user of the system.
PII (Personally Identifiable Information): ANY information the DS inputs to the system, and any information derived as the DS uses the system that can be linked back to the user (identified).
PA: personal agent.
JSA: Job Seeker Agent.
JMAA: Job Market Advisory Agent; in terms of privacy, the JMAA can also be any other agent except the PA and JSA.
PS (purpose specification): the reason stated for processing; there can be more than one per transaction.
RP (retention period): how long the data may be held by a controller or processor.
POK (Proof Of Knowledge): some asymmetric cryptographic approach.
Controller ID: this can start as an agent identifier, but must also include who the controller is legally, such as a tax registration number.
APS (Agent Practices Statement): very similar to a CPS.

Roles
The PA and JSA are the controller agents, though from our point of view they are considered as one. The JMAA, and any other agent in the system, is a processor agent. The DS only has rights over the controller. The controller agent has to ensure that the processor agent does as he should with the DS's PII. To do this he can ask for things from the processor agent; he will also need the help of the system architecture. The JMAA as processor is irrelevant for the DS: the DS only has powers over the JSA as the controller, but the JSA has to ensure that the JMAA is doing the right thing. So the JSA needs to know what the right thing is; he thus carries knowledge of this.
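The controller's deletion duty described above can be sketched as follows. This is an illustrative model only: the data shapes are invented, and the real agents enforce deletion through the system architecture rather than shared objects:

```javascript
// Roles sketch (hypothetical): when an objection is upheld, the controller
// (PA/JSA) deletes its own copy of the PII and ensures every processor
// that received the PII does the same. Transparency here means the
// controller knows the full recipient list.
function deleteEverywhere(controller) {
  controller.pii = null;
  controller.recipients.forEach(function (processor) {
    processor.pii = null; // controller enforces deletion at each processor
  });
  // confirm the deletion propagated to every recipient
  return controller.recipients.every(function (p) { return p.pii === null; });
}
```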


9.2 HCI UML Diagrams for the PISA Demonstrator

[The HCI UML diagrams for the PISA Demonstrator appear here as figures, spanning pages 67 to 82 of the original document.]

9.3 Prototype Interface Code Documentation

This appendix describes the hierarchy of the PISA web interface and presents the navigation between its pages and scripts in an easily accessible manner.

Additionally, the descriptions of the pages and their purposes document some of the problems encountered during their design, and the solutions used to eliminate or work around these issues.

Index of HTML Pages and CGI Scripts

1. Index.html
2. Greeting.html
3. Login.html
4. Register.html
5. Help.html & Help2.html
6. Personalagent.cgi
7. Create.html
8. Results.html
9. Create1.cgi
10. Create.cgi
11. Createprefs.cgi
12. Preferences.cgi
13. Create_jitcta.html
14. Confirmcreate.cgi
15. Modify.cgi
16. Interim.cgi
17. Interim_agree.cgi
18. Interim_disagree.cgi
19. Track.cgi
20. Trackagent.cgi
21. Detailsview.cgi
22. Objection.cgi
23. Agent.cgi
24. Agent_change.cgi
25. Modifyagent.cgi
26. Disclosed.cgi
27. Changeprofile.cgi
28. Modifyprofile.cgi
29. Newprofile.cgi
30. Changepass.cgi
31. Newpass.cgi
32. Changeprefs.cgi
33. Newpreferences.cgi
34. Createresume.cgi
35. Newresume.cgi
36. Changeresume.cgi
37. Getresume.cgi
38. Firstprefs.cgi
39. Registration.cgi
40. Deregister.cgi and Logout.cgi
41. Login.cgi
42. Correct.cgi and Cancel.cgi
43. Stopagent.cgi
44. Sensitivedialog.html
45. Startdaemon.pl
46. Style.css
47. Writemenu.js
48. Cookiestuff.js
49. Text file samples


1. Index.html

Location: htdocs folder

Links: greeting.html (2) and note-greeting.html are called as part of the frame layout.

Page responsibilities: This html document is used to set the frame layout of the interface pages, with content in the right frame and a set of development notes displayed on the left.

Notes on the Code: A simple frameset is used here to lay out two columns in one row. The widths are given as percentages and can be changed as desired. The NAME attributes in the FRAME tags should not be changed; only the SRC attribute can be safely changed.


2. Greeting.html

Location: htdocs folder

Links:
login.html (3) can be reached via hyperlink.
note-greeting.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This html document is the launch page for the interface. It provides a single link to the main login interface. NOTE: This page is a temporary page used in development as an introduction, and should not appear in the final version of the web interface.

Notes on the Code: Basic html code for a simple start page. Imports the CSS file for style properties.

JavaScript condition code is introduced here to display a note page in the left-hand frame. This code is as follows:

<SCRIPT LANGUAGE="JavaScript">

<!--

if (parent.frames.Left) {

parent.frames.Left.location = "./notes/note-greeting.html";

}

//-->

</SCRIPT>


3. Login.html

Location: htdocs folder

Links:
register.html (4) can be reached via hyperlink.
help2.html (5) can be reached via hyperlink.
login.cgi (41) is called on form submission from the “Login” button.
note-login.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
cookiestuff.js is imported to provide necessary cookie modification code.

Page responsibilities: This html document is the login page for the interface. From here, users can choose to register to use the system, or log in if they have previously registered. Special responsibilities include:
• Displaying an error message if the login script fails and redirects back to this page
• Setting a “username” cookie on form submission

Notes on the Code: Basic html code, with one JavaScript code segment to display an error message:

<SCRIPT LANGUAGE="JavaScript">
<!--

var pos_ques = document.URL.indexOf("?"); //find index of ‘?’ in URL, if it exists

if (pos_ques >= 0) {

document.write("<P><font color='#FF0000'>Incorrect User name or Password - Please try again</font>");

}
//-->
</SCRIPT>
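The username-cookie handling supplied by cookiestuff.js is not reproduced in this appendix, so the following is an assumed sketch of the sort of helper involved; the function name and cookie-string format are illustrative, not taken from the actual file:

```javascript
// Hypothetical cookie-reading helper in the spirit of cookiestuff.js.
// It operates on a cookie string of the form "name=value; name2=value2",
// so it can be exercised outside a browser; in the page it would be
// called as getCookieValue(document.cookie, "username").
function getCookieValue(cookieString, name) {
  var parts = cookieString.split("; ");
  for (var i = 0; i < parts.length; i++) {
    var eq = parts[i].indexOf("=");
    if (parts[i].substring(0, eq) === name) {
      return decodeURIComponent(parts[i].substring(eq + 1));
    }
  }
  return null; // cookie not present
}
```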


4. Register.html

Location: htdocs folder

Links:
login.html (3) can be reached via the “Cancel” button.
help2.html (5) can be reached via hyperlink.
firstprefs.cgi (38) is called on form submission from the “Register” button.
note-register.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This html document is the registration page for the interface. Users will use this page to enter their contact information, and other pertinent details about themselves. A username and password are also chosen at this point.

Notes on the Code: This is the first occurrence of the “floating” images at the top of the page. They are set up as DIV tags within the html code, with a specific ID attribute:

<div id="profile" style="position:absolute; top:15px; left:20px; width:150px; height:50px; z-index:1;">
  <img src="../profile2.gif" height="50">
</div>
<div id="logo" style="position:absolute; top:5px; left:25px; width:150px; height:50px; z-index:1;">
  <img src="../PISA-logo3.gif" height="90">
</div>


Notes, Cont’d: These DIV tags must have a unique ‘id’ attribute, and must have the ‘position’ variable set to ‘absolute’ within the style attribute. The ‘top’ and ‘left’ variables are used to position the image initially (though we will see how this can be changed later). The ‘width’ and ‘height’ variables are the size of the image. The last variable, ‘z-index’, must be set to “1” so that the image “floats” above all other page content.

The JavaScript code to re-position an image to the top right corner of the page is complicated, because it must cover different properties for different types of browsers. It is suggested that changes to the code be avoided, since small alterations will cause drastic changes in the behaviour of the images.

<SCRIPT LANGUAGE="JavaScript">

<!--

function DOMGetElement(o) {

if (document.getElementById) return document.getElementById(o);

else if (document.all) return document.all[o];

else if (document.layers) return document.layers[o];

return null;

}

function DOMWindowGetXOffset() {

if (document.all) return document.body.scrollLeft;

else if (document.getElementById) return window.pageXOffset;

else if (document.layers) return window.pageXOffset;

}

function DOMElementSetLeftPos(o,val) {

if (document.getElementById) o.style.left = val;

else if (document.all) o.style.left = val;

else if (document.layers) o.pageX = val;

}

function DOMWindowGetInnerWidth() {

if (document.all) return document.body.clientWidth;

else if (document.getElementById) return window.innerWidth;

else if (document.layers) return window.innerWidth;

}

function DOMElementGetWidth(o) {

if (document.all) return o.clientWidth;

else if (document.getElementById) return parseInt(o.offsetWidth);

else if (document.layers) return o.document.width;

}

function pageOffset() {

var o = DOMGetElement('profile');

if (o) {

DOMElementSetLeftPos(o, DOMWindowGetXOffset() + DOMWindowGetInnerWidth() - DOMElementGetWidth(o) - 20);

setTimeout('pageOffset()',10);

}

}


//-->

</SCRIPT>

Note that the id value of the image must be specified in the pageOffset() function. The final code adjustment to the page is to call pageOffset() from the body onLoad event:

<body lang=EN-US link=blue vlink=purple style='tab-interval:.5in' onLoad='pageOffset()'>


5. Help.html & Help2.html

Location: htdocs folder

Links:
note-help.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
writemenu.js is imported to provide the navigation bar code for the page.

Page responsibilities: This html document is used to provide the user with help concerning the proper use of the interface. At present, this is a temporary document, and the content should be changed for the final interface.

Notes on the Code: The help page code contains many DIV tags to separate the content sections. This allows other pages to link to specific, relevant sections of the help page, and avoid the rest of the content.

A simple DIV tag is created on the help page: <div id="login">
A link to this section would look like: <a href="./help2.html#login">Help</a>


6. Personalagent.cgi

Location: cgi-bin folder

Links:
create.html (7) can be reached via the navigation bar or hyperlink.
modify.cgi (15) can be reached via the navigation bar or hyperlink.
track.cgi (19) can be reached via the navigation bar or hyperlink.
results.html (8) can be reached via the navigation bar or hyperlink.
help.html (5) can be reached via the navigation bar or hyperlink.
logout.cgi (40) can be reached via the navigation bar.
changeprofile.cgi (27) can be reached via hyperlink.
note-main.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
writemenu.js is imported to provide the navigation bar code for the page.
cookiestuff.js is imported to provide necessary cookie modification code.

Page responsibilities: This script displays the central page for the web interface. The dynamic menu bar and links reflect what actions can be performed according to the number of agents that the user has created. A status bar displays the number of agent messages that have yet to be read.

Notes on the Code:


This page relies heavily on JavaScript code for the dynamic content. The setup for the link involves some CGI script code as well.

The CGI script parses the user’s cookie (set in login.html) and extracts the username, which is used to open the user’s profile file (created during registration); that file contains a count of the user’s agents. This number is saved as a variable and written into the html and JavaScript code conditions, where appropriate. From here, the JavaScript is a simple if statement to control the links:

<SCRIPT LANGUAGE="JavaScript">

<!--

if ($num > 0) {

// establish link only if agents exist

document.write("<a href='./modify.cgi' onMouseOver=showhelp('modifyhelp') onMouseOut=hidehelp('modifyhelp')>");

}

//-->

</SCRIPT>

Modify Agent</a>

The leftover “</a>” tag does not affect the page if the script does not print the beginning of the link tag.

The action code for the link calls JavaScript functions that display pop-up help boxes. The code itself is an amalgamation of the image positioning code and some mouse-movement handling code.

To display the number of waiting messages, a process similar to the above CGI script process is used. From the user’s cookie, the username is used to open the user’s interim messages file. The size of the file is extracted, and a small bit of math is used to calculate the number of messages that the file contains. This number is printed in the code to tell the user how many interim messages have been unanswered.

Interim messages code walkthrough:

• ($dev,$ino,$mode,$nlink,$uid,$gid,$rdev,$size,$atime,$mtime,$ctime,$blksize,$blocks) = stat("$FORM{'username'}"."_interim.txt");

The ‘stat’ operation on a file returns an array of information about the text file. We are only concerned with $size.

• if ($size > 5)

If the size of the file is significant (i.e. greater than 5 characters), we will read the file.

• open(INTERIM, "$FORM{'username'}_interim.txt") or dienice("Can't open $FORM{'username'}_interim.txt: $!");
• @interimbuffer = <INTERIM>;

Open the file, and extract the contents into a buffer array.

• $messages = scalar(@interimbuffer);

Each line in the file is one member of the array; therefore the length of the array is the number of entries in the file.

• $messages = $messages / 6;

Each interim message is six lines long in the text file. Therefore, the length of the file is divided by six to determine the number of interim messages in the file. This number is stored


in the variable $messages, and later written into the page code.
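The Perl record count above can be restated as a small JavaScript helper. This is a hypothetical restatement for clarity only; the demonstrator performs this step in the Perl CGI script:

```javascript
// Sketch of the interim-message count: each message occupies six lines
// of the per-user text file, so the number of messages is the line
// count divided by six.
function countInterimMessages(fileText) {
  if (fileText.length <= 5) return 0; // mirrors the Perl "$size > 5" guard
  var lines = fileText.split("\n").filter(function (l) { return l.length > 0; });
  return Math.floor(lines.length / 6);
}
```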


7. Create.html

Location: htdocs folder

Links:
personalagent.cgi (6) can be reached via the navigation bar.
modify.cgi (15) can be reached via the navigation bar.
track.cgi (19) can be reached via the navigation bar.
results.html (8) can be reached via the navigation bar.
help.html (5) can be reached via the navigation bar.
logout.cgi (40) can be reached via the navigation bar.
create1.cgi (9) is called upon form submission.
note-create-1.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
writemenu.js is imported to provide the navigation bar code for the page.

Page responsibilities: This html document is the first page in a series of steps that lead to the creation of a new agent. This page allows you to label the new agent and assign it a task. Currently, only Job Seek Agents can be created.

Notes on the Code: A hidden text field and a single line of JavaScript code are used to add a timestamp to the agent, effectively providing each agent with a distinctive “birthday”.
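The timestamp one-liner itself is not reproduced above, so the following is a plausible sketch of the idea with assumed names; the actual field and form names may differ:

```javascript
// Hypothetical "birthday" helper: stamp the hidden form field with the
// agent's creation time as a millisecond timestamp string.
function agentBirthday(now) {
  return String(now.getTime());
}
// In the page this might be used as:
//   document.createAgent.birthday.value = agentBirthday(new Date());
```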


8. Results.html

Location: htdocs folder

Links:
personalagent.cgi (6) can be reached via the navigation bar.
create.html (7) can be reached via the navigation bar.
modify.cgi (15) can be reached via the navigation bar.
track.cgi (19) can be reached via the navigation bar.
help.html (5) can be reached via the navigation bar.
logout.cgi (40) can be reached via the navigation bar.
note-results.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
writemenu.js is imported to provide the navigation bar code for the page.

Page responsibilities: This html document is used to display only the final results of an agent’s job search. This is a temporary page, and content will need to be added in the final version of the interface.


9. Create1.cgi

Location: cgi-bin folder

Links:
create.cgi (10) is called upon form submission.
note-create-1.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays the second stage in the agent creation process. Once an agent has been named and an action assigned to it, that information is stored by calling this script. If the user does not have a CV prepared, this script will not continue with the agent creation process. If they do, however, this script creates an html form in which search instructions can be provided and a CV assigned to this new agent.

Notes on the Code: This script parses the user’s résumés as it loops through the user’s CV file. It extracts the name of each CV to add to the list box. To handle the case where the user has not yet prepared a CV, a simple html page is written in a “dienice” subroutine that is executed if the CV file is not found.
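The list-box construction described here can be sketched as follows. This is illustrative only; the actual script emits this markup from Perl, and the function name is ours:

```javascript
// Hypothetical helper: turn the extracted CV names into the <option>
// markup for the list box in the agent-creation form.
function buildCvOptions(cvNames) {
  return cvNames.map(function (name) {
    return "<option value=\"" + name + "\">" + name + "</option>";
  }).join("");
}
```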


10. Create.cgi

Location: cgi-bin folder

Links:
createprefs.cgi (11) is called upon form submission.
correct.cgi (42) is called if the “Make Corrections” button is pressed.
note-create-2.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays an html document, which is used to confirm that the data entered in the previous two stages of the agent creation process is correct. If the data is incorrect, the user is given the option of restarting the process. If the data is satisfactory, the user may continue, and this script stores the instructions.

Notes on the Code: This script parses the user’s profile file, their CV file, and the submitted form data to create the table of data for confirmation.


11. Createprefs.cgi

Location: cgi-bin folder

Links:
preferences.cgi (12) is called upon form submission.
note-create-2.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays an html form consisting of a large grid of checkboxes that are used to set the handling preferences of a new agent. It draws upon the first set of preferences selected at registration, and displays them as a default. Changes can be made to tailor the actions of this particular agent, and then submitted.

Notes on the Code: To display the saved preferences, a great deal of script code is used to parse the user’s preferences file, extract the useful data, and then construct the html code to check the correct boxes.

The JavaScript code that changes the presets upon radio button click events is a long set of “document.form.object.checked=true” statements. The internal documentation covers these code sections in detail.
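The preset behaviour can also be modelled as a lookup table instead of long runs of checked=true statements. This is a hypothetical sketch; the preset names and checkbox ids are invented, not taken from the PISA forms:

```javascript
// Hypothetical preset table: each radio button maps to the set of
// checkbox ids that should be ticked when it is clicked.
var PRESETS = {
  strict:  ["noSharing", "notifyAlways"],
  relaxed: ["allowSharing"]
};
function boxesToCheck(presetName) {
  return PRESETS[presetName] || [];
}
// A click handler would then loop over boxesToCheck(name) and set
// document.prefsForm[id].checked = true for each id returned.
```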


12. Preferences.cgi

Location: cgi-bin folder

Links:
create_jitcta.html (13) is displayed in a pop-up window upon form submission.
confirmcreate.cgi (14) is called upon form submission (if the JITCTA is accepted).
correct.cgi (42) is called upon “Make Corrections” button press.
cancel.cgi (42) is called upon “Cancel” button press.
note-create-2.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays an html page, which approaches the conclusion of the agent creation process. The agent’s instructions are echoed to the user for confirmation, and the user is asked to continue or make changes.

Notes on the Code: This page contains the code used to launch the JITCTA:

function JITCTAWindow() {
  JITCTADialog = window.open('../create_jitcta.html', 'newwindow', config = 'height=250, width=400, toolbar=no, menubar=no, scrollbars=no, resizable=no, location=no, directories=no, status=no, modal=yes');
}


13. Create_jitcta.html

Location: htdocs folder

Links: preferences.cgi (12) is modified by JavaScript code in this page.

Page responsibilities: This html document is displayed in a pop-up window when the user confirms an agent’s instructions. This Just-In-Time-Click-Through Agreement obtains unambiguous consent from the user. Please see Dr. Andrew Patrick’s paper on JITCTAs for an in-depth look at these widgets.

Notes on the Code: Preferences.cgi contains a hidden text field that is used to store the action of the user in response to this JITCTA window’s prompt. The JavaScript code modifies this field, and closes this window, when a user clicks “I agree” or “I do not agree”.

It accesses the parent page’s properties through the Document Object Model. The parent is known as the “opener” in the DOM, and the hidden text field is called “hiddenAgent”, for our purposes. The JavaScript function for agreement looks like this:

function Agree()

{

opener.document.hiddenAgent.label.value="Agree";

opener.document.confirmAgent.submit();

window.close();

}


14. Confirmcreate.cgi

Location: cgi-bin folder

Links:
personalagent.cgi (6) can be reached via the navigation bar.
modify.cgi (15) can be reached via the navigation bar.
track.cgi (19) can be reached via the navigation bar.
results.html (8) can be reached via the navigation bar.
help.html (5) can be reached via the navigation bar.
logout.cgi (40) can be reached via the navigation bar.
note-create-2.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
writemenu.js is imported to provide the navigation bar code for the page.
cookiestuff.js is imported to provide necessary cookie modification code.

Page responsibilities: This script is used to “launch” a new agent. The temporary file of information is added to the agent list as a new member, and the cookie file on the user’s computer is updated to reflect this new agent. The user may now navigate anywhere in the interface.

Notes on the Code: This script also updates the ‘numAgents’ variable in the user’s profile file, in order that the user’s cookie will be updated appropriately.


15. Modify.cgi

Location: cgi-bin folder

Links:
personalagent.cgi (6) can be reached via the navigation bar.
create.html (7) can be reached via the navigation bar.
track.cgi (19) can be reached via the navigation bar.
results.html (8) can be reached via the navigation bar.
help.html (5) can be reached via the navigation bar.
logout.cgi (40) can be reached via the navigation bar.
agent.cgi (23) is called upon “Details” button press.
agent_change.cgi (24) is called upon “Change” button press.
stopagent.cgi (43) is called upon “Stop” button press.
interim.cgi (16) is displayed in a pop-up window if a message is waiting.
note-modify.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
writemenu.js is imported to provide the navigation bar code for the page.
cookiestuff.js is imported to provide necessary cookie modification code.

Page responsibilities: This script is used to list all active agents, and provide a means for the user to view their details, edit their details, or stop an agent altogether. The script also includes code to pop up interim messages to the user, should one be waiting.


16. Interim.cgi

Location: cgi-bin folder

Links:
interim_agree.cgi (17) is called upon “Yes, process…” button press.
interim_disagree.cgi (18) is called upon “No, do not process…” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays a page that is shown in a pop-up window when an interim message is received from an agent. The user can choose whether to accept the processing of the desired piece of information, or refuse to allow the action.


17. Interim_agree.cgi

Location: cgi-bin folder

Links: style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays a page in the pop-up interim message window when the user chooses to allow their information to be shared, as requested.

Notes on the Code: This script essentially copies the information pertaining to the interim message over into the user’s default message file for later reference. It then cleans the interim message out of the user’s interim message file.
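The copy-then-clean behaviour can be sketched as follows. This is a minimal JavaScript model using in-memory arrays in place of the user’s two message files; the record layout is an assumption (the real script is a Perl CGI working on text files):

```javascript
// Sketch: accept an interim message by copying it into the permanent
// message list and removing it from the interim list, mirroring what
// interim_agree.cgi does with the user's message files.
function acceptInterimMessage(interimList, messageList, index) {
  // splice() both removes the record and hands it back for copying.
  const [accepted] = interimList.splice(index, 1);
  messageList.push(accepted);
  return accepted;
}

// Example (record format is illustrative only):
const pendingExample = ['JobAgent1|CV|2003-01-15'];
const logExample = [];
acceptInterimMessage(pendingExample, logExample, 0);
// pendingExample is now empty; logExample holds the accepted record.
```

The real script performs the same two steps on files: append the record to the default message file, then rewrite the interim message file without it.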


18. Interim_disagree.cgi

Location: cgi-bin folder

Links: style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays a page in the pop-up interim message window when the user chooses not to allow their information to be shared as requested.

Notes on the Code: This script simply deletes the unwanted interim message information from the user’s interim message file.


19. Track.cgi

Location: cgi-bin folder

Links:
personalagent.cgi (6) can be reached via the navigation bar.
create.html (7) can be reached via the navigation bar.
modify.cgi (15) can be reached via the navigation bar.
results.html (8) can be reached via the navigation bar.
help.html (5) can be reached via the navigation bar.
logout.cgi (40) can be reached via the navigation bar.
trackagent.cgi (20) is called upon “Recent Actions” button press.
disclosed.cgi (26) can be reached via hyperlink.
interim.cgi (16) is displayed in a pop-up window if a message is waiting.
note-track.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
writemenu.js is imported to provide the navigation bar code for the page.
cookiestuff.js is imported to provide necessary cookie modification code.

Page responsibilities: This script displays a list of the user’s agents in an HTML page, and allows the user to view their recent actions, or to display a comprehensive list of all agent actions, for the purposes of tracking said agents.


20. Trackagent.cgi

Location: cgi-bin folder

Links:
personalagent.cgi (6) can be reached via the navigation bar.
create.html (7) can be reached via the navigation bar.
modify.cgi (15) can be reached via the navigation bar.
results.html (8) can be reached via the navigation bar.
help.html (5) can be reached via the navigation bar.
logout.cgi (40) can be reached via the navigation bar.
detailsview.cgi (21) is called upon “Details” button press.
objection.cgi (22) is called upon “Object” button press.
track.cgi (19) can be reached via hyperlink.
note-track.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
writemenu.js is imported to provide the navigation bar code for the page.

Page responsibilities: This script displays a list of the recent actions of the selected agent. Users can choose to view the details of the action, or object to the action altogether. This script draws on the output of the background daemon to show agent actions.


21. Detailsview.cgi

Location: cgi-bin folder

Links:
trackagent.cgi (20) is called upon “Return…” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays the details of an agent action, selected by the user. The details are generated by the background daemon process, and are read from text files, parsed and then presented in a meaningful way here.


22. Objection.cgi

Location: cgi-bin folder

Links:
trackagent.cgi (20) is called upon “Return…” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays a confirmation page when the user objects to a specific agent action. From here, the user can return to the ‘recent actions’ page.


23. Agent.cgi

Location: cgi-bin folder

Links:
agent_change.cgi (24) is called upon “Make Corrections” button press.
modify.cgi (15) is called upon “Back” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script recalls the details of an agent, selected by the user from the ‘modify agent’ page. Options exist to return to the modify page, or to make changes to the agent instructions.


24. Agent_change.cgi

Location: cgi-bin folder

Links:
modifyagent.cgi (25) is called upon “Save & Submit” button press.
modify.cgi (15) is called upon “Cancel” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script provides the user with an HTML form that can be used to change the instructions of an agent, selected by the user from the “modify agent” page.


25. Modifyagent.cgi

Location: cgi-bin folder

Links:
modify.cgi (15) is called upon “Back” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This page displays the new instructions provided to the agent, through the “modify agent” process, by the user.


26. Disclosed.cgi

Location: cgi-bin folder

Links:
track.cgi (19) can be reached via “Back to Track Agent” hyperlink.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays a complete, dynamic log of all agent actions. This list can be sorted by date, agent name, or information type.

Notes on the Code: The script code to sort the columns is a Perl routine for sorting arrays. The links point to this same page, and add a variable to the query string. The script parses the query string, extracts the variable, and arranges the array of data accordingly.
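The sort-on-a-query-string-key idea can be sketched as follows. This is a JavaScript outline with illustrative field names; the original is a Perl sort routine:

```javascript
// Sketch: sort the disclosure log by the column named in the query
// string, as disclosed.cgi does. Field names are illustrative.
function parseSortKey(queryString) {
  // e.g. "sort=agent" -> "agent"; default to sorting by date.
  const match = /(?:^|&)sort=([^&]*)/.exec(queryString);
  return match ? match[1] : 'date';
}

function sortLog(rows, queryString) {
  const key = parseSortKey(queryString);
  // Return a sorted copy; plain string comparison on the chosen field.
  return rows.slice().sort((a, b) => (a[key] < b[key] ? -1 : a[key] > b[key] ? 1 : 0));
}

// Example disclosure log rows (layout is an assumption):
const logRows = [
  { date: '2003-01-20', agent: 'JobAgent2', type: 'CV' },
  { date: '2003-01-15', agent: 'JobAgent1', type: 'address' },
];
const byAgent = sortLog(logRows, 'sort=agent');
// byAgent is ordered JobAgent1, JobAgent2.
```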


27. Changeprofile.cgi

Location: cgi-bin folder

Links:
personalagent.cgi (6) can be reached via “Return to main…” hyperlink.
modifyprofile.cgi (28) can be reached via hyperlink.
changepass.cgi (30) can be reached via hyperlink.
changeprefs.cgi (32) can be reached via hyperlink.
createresume.cgi (34) can be reached via hyperlink.
changeresume.cgi (36) can be reached via hyperlink.
deregister.cgi (40) can be called via hyperlink.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script is used to show a page of links to other scripts that can be used to edit the user’s personal profile. The page also displays a link that can be used to delete the user’s profile from the system in its entirety.


28. Modifyprofile.cgi

Location: cgi-bin folder

Links:
newprofile.cgi (29) is called upon form submission.
changeprofile.cgi (27) is called upon “Cancel” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays an HTML form that can be used to edit the contact details of the user.

Notes on the Code: This script opens and parses the user’s profile file to extract the user’s contact information. This information is then displayed in the text fields for editing.


29. Newprofile.cgi

Location: cgi-bin folder

Links:
changeprofile.cgi (27) is called upon “Back …” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays a confirmation message when a user submits their new contact information. This script also updates their stored profile.


30. Changepass.cgi

Location: cgi-bin folder

Links:
newpass.cgi (31) is called upon form submission.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays a simple form in a pop-up window when a user chooses to change his or her password.


31. Newpass.cgi

Location: cgi-bin folder

Links: style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays a confirmation message in the “change password” pop-up window when the user selects a new password. This script also updates the user’s profile.

Notes on the Code: This script first checks to see if the new password matches the confirmation password. If this fails, it displays an error message page. If the passwords match, the script opens the user’s profile file, extracts the previous password, and compares the old password in the file to the old password that the user supplied on the form. If these do not match, an error message page is displayed. If the match is successful, the user’s profile file is updated with the new password.
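The check order described above can be sketched as a small validation function. The function name and return shape are assumptions; the real script emits error pages instead of returning values:

```javascript
// Sketch: validate a password change in the same order newpass.cgi
// checks it -- confirmation match first, then old-password match.
function validatePasswordChange(storedOld, suppliedOld, newPass, confirmPass) {
  if (newPass !== confirmPass) {
    return { ok: false, error: 'new password and confirmation differ' };
  }
  if (storedOld !== suppliedOld) {
    return { ok: false, error: 'old password does not match profile' };
  }
  // On success the caller would rewrite the profile file with newPass.
  return { ok: true, password: newPass };
}
```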


32. Changeprefs.cgi

Location: cgi-bin folder

Links:
newpreferences.cgi (33) is called upon form submission.
changeprofile.cgi (27) is called upon “Cancel” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays an HTML form of check boxes that allow the user to modify the existing default agent handling preferences.


33. Newpreferences.cgi

Location: cgi-bin folder

Links:
changeprofile.cgi (27) is called upon “Back …” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays a confirmation message when a user submits their new agent handling preferences. This script also updates their stored default preferences.


34. Createresume.cgi

Location: cgi-bin folder

Links:
newresume.cgi (35) is called upon form submission.
changeprofile.cgi (27) is called upon “Cancel” button press.
sensitivedialog.html (44) is launched by entering data into the Trade Union Membership box.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script provides a form that a user can use to create a new CV (résumé).

Notes on the Code: To call the JITCTA only once, on the first click in the trade union field, an integer variable is used. SensitiveDialog.html modifies this value. The code to launch the dialog looks like this:

var agreeVar = -1;

function SensitiveWindow() {

if (agreeVar == -1) {

SensitiveDialog = window.open('../sensitivedialog.html','newwindow');

}

}


35. Newresume.cgi

Location: cgi-bin folder

Links:
changeprofile.cgi (27) is called upon “Back …” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays a confirmation message when a user submits their new résumé. This script adds this new résumé to the list of other résumés.

Notes on the Code: This script is called by two different pages: the createresume.cgi page and the getresume.cgi page. As such, the code is written to build an array of résumés from the user’s résumé file and the newly submitted résumé data. This array is then used to overwrite the old résumé file. In this manner, duplicate résumés can be caught and removed from the list.
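The overwrite-and-dedupe step might look like this in outline. This is a JavaScript sketch with an assumed record shape; the actual script is Perl and rewrites the résumé file:

```javascript
// Sketch: merge a newly submitted resume into the stored list. Any
// stored resume with the same name is dropped, so the new data
// replaces it -- this is how duplicates are caught and removed.
function mergeResumes(storedResumes, newResume) {
  const kept = storedResumes.filter(r => r.name !== newResume.name);
  kept.push(newResume);
  // The caller would overwrite the old resume file with this array.
  return kept;
}

// Example: resubmitting "CV-1" replaces the stored copy.
const mergedExample = mergeResumes(
  [{ name: 'CV-1', body: 'old text' }, { name: 'CV-2', body: 'other' }],
  { name: 'CV-1', body: 'edited text' }
);
// mergedExample still has two entries, and CV-1 now holds the edit.
```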


36. Changeresume.cgi

Location: cgi-bin folder

Links:
getresume.cgi (37) is called upon form submission.
changeprofile.cgi (27) is called upon “Cancel” button press.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script displays a list of the user’s résumés that he or she may select to edit.

Notes on the Code: The script code parses the user’s CV file to find the names of all of the user’s résumés. These names are built into a list box for selection.


37. Getresume.cgi

Location: cgi-bin folder

Links:
newresume.cgi (35) is called upon form submission.
changeprofile.cgi (27) is called upon “Cancel” button press.
sensitivedialog.html (44) is launched by entering data into the Trade Union Membership box.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script retrieves the details of a user’s résumé so that they may be edited.

Notes on the Code: This script not only parses the CV information for display; it must first locate the information, based only on the name of the CV provided on the previous page. This is why it is important that the script that saves the CVs monitors the CV name variable to prevent duplication.


38. Firstprefs.cgi

Location: cgi-bin folder

Links:
registration.cgi (39) is called upon form submission.
style.css is established as a link to provide this page with common CSS attributes.

Page responsibilities: This script creates a new profile for a new user, and then displays a form of checkboxes, which the user can use to set their default agent handling preferences.


39. Registration.cgi

Location: cgi-bin folder

Links:
login.cgi (41) is called upon form submission.
note-register-results.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
cookiestuff.js is imported to provide necessary cookie modification code.

Page responsibilities: This script stores the initial default agent handling preferences, and displays an HTML page that confirms the registration of a new user. A login form is also provided to give access to the system.

Notes on the Code: This script creates a default agent preference file for the user’s agents. It also creates a random “controller ID” on the fly. On login, it sets a cookie on the user’s computer with the user’s username.
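Creating a random controller ID on the fly might be sketched as below, in modern JavaScript. The “CTRL-” prefix and eight-digit format are purely assumptions, since the deliverable does not specify the ID format:

```javascript
// Sketch: generate a random controller ID at registration time.
// The prefix and digit count are illustrative only.
function makeControllerId() {
  const n = Math.floor(Math.random() * 100000000); // 0 .. 99999999
  return 'CTRL-' + String(n).padStart(8, '0');
}
```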


40. Deregister.cgi & Logout.cgi

Location: cgi-bin folder

Links:
login.html (3) can be reached via hyperlink.
note-login.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
cookiestuff.js is imported to provide necessary cookie modification code.

Page responsibilities: Deregister.cgi deletes all of the user’s profile files, and simply redirects to logout.cgi. Logout.cgi removes all cookies from the user’s computer, and removes their username from the list of users that the daemon consults to generate messages.

Notes on the Code: Deregister.cgi deletes every file that was created for the user. The command for this is of the form:

unlink "./$COOKIE{'username'}.txt";

Logout.cgi edits the “login_allusers.txt” file to remove the user’s username. This script calls two functions in cookiestuff.js, deletecookieUser() and deletecookieNum(), to delete the cookies stored on the user’s computer.
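In classic browser JavaScript, deleting a cookie means re-setting it with an expiry date in the past. A sketch of what the two delete helpers might build follows; the cookie names ‘username’ and ‘numAgents’ are assumptions based on the descriptions in this document:

```javascript
// Sketch: build the document.cookie strings that delete a cookie by
// expiring it. In the browser, each returned string would be assigned
// to document.cookie.
function expiredCookieString(name) {
  return name + '=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/';
}

function deletecookieUser() {
  return expiredCookieString('username');   // the 'user' cookie (assumed name)
}

function deletecookieNum() {
  return expiredCookieString('numAgents');  // the 'num' cookie (assumed name)
}
```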


41. Login.cgi

Location: cgi-bin folder

Links:
login.html (3) can be reached when login fails.
personalagent.cgi (6) is reached under successful login conditions.

Page responsibilities: This script simply compares a user-supplied password with the password stored in their profile. If login fails, they are redirected to the login page. If login succeeds, their username is added to the list of logged-in users, and they are redirected to the main interface page.

Notes on the Code: Should the login fail, the script redirects to login.html with the following line of code:

print "Location: ../login.html?", "\n\n";

The login.html page recognizes the existence of the “?” in the URL, and displays an error message only when it is found.

This script appends the user’s username to the end of the “login_allusers.txt” file so that the daemon may begin to send messages to this user’s agents, should he or she have any active ones.
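The page-side “?” check can be sketched as a one-line predicate. The function name is hypothetical; in login.html itself the test would run against window.location.href before unhiding the error text:

```javascript
// Sketch: login.html shows its error message only when the URL carries
// the "?" that login.cgi appends on a failed login.
function shouldShowLoginError(url) {
  return url.indexOf('?') !== -1;
}
```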


42. Correct.cgi and Cancel.cgi

Location: cgi-bin folder

Links:
personalagent.cgi (6) can be reached via the navigation bar.
modify.cgi (15) can be reached via the navigation bar.
track.cgi (19) can be reached via the navigation bar.
results.html (8) can be reached via the navigation bar.
help.html (5) can be reached via the navigation bar.
logout.cgi (40) can be reached via the navigation bar.
note-create-1.html is displayed to the left if the frame layout is used.
style.css is established as a link to provide this page with common CSS attributes.
writemenu.js is imported to provide the navigation bar code for the page.

Page responsibilities: Cancel.cgi displays a navigation page. Correct.cgi redirects to create.html. Both scripts are responsible for deleting all temporary files created during the agent creation process.

Notes on the Code: During the agent creation process, temporary files are created to make editing easier, and to prevent half-complete agents from receiving messages before they are officially “launched”. When corrections are made, these temporary files are simply deleted, to be replaced with new data from the user.


43. Stopagent.cgi

Location: cgi-bin folder

Links: modify.cgi (15) is reached via re-direction.

Page responsibilities: This script removes the deleted agent from the user’s list of active agents; updates the number of agents so that the next page can update the user’s cookie; and finally, redirects to modify.cgi.


44. Sensitivedialog.html

Location: htdocs folder

Links: content on createresume.cgi (34) and getresume.cgi (37) is modified by this page.

Page responsibilities: This pop-up window is used to obtain consent to the processing of sensitive information, such as membership in a trade union. It appears when a user begins to enter this information into the appropriate résumé field.

Notes on the Code: A hidden text field exists on both of the CV pages that have a “trade union membership” text field. This field is used to track consent to the processing of this type of sensitive information. A simple “Agree” or “Disagree” is placed in this field when the user makes a choice in this window. When the form is submitted, the user’s agreement choice is stored along with the rest of the CV details that they have provided.

This page also uses the Document Object Model (DOM) to access the variable on createresume.cgi (34) which controls the launch of this JITCTA. The JavaScript code to modify this variable is as follows:

opener.agreeVar = 1;


45. Startdaemon.pl

Location: cgi-bin folder

Responsibilities: This script is responsible for generating messages (interim and otherwise) for users that are logged into the system.

Notes on the Code: Every fifth message generated by the daemon process is an interim message. This number can be changed in the code.

The script begins an “infinite” loop. It then opens “login_allusers.txt” and seeds the randomizer function. If there are users signed in, it will choose one at random, and retrieve their profile to extract the number of agents they own.

If the user owns any agents, a message is generated. Information is pulled from “info_type.txt” and “info_proc.txt”, and a timestamp is generated.

An agent is selected at random from the user’s list of agents, and the associated CV label is extracted from the file, in case the CV is the “shared” piece of information.

The “message” is now written to the user’s message file, unless it is an interim message, in which case it is written to the user’s interim message file.

The script now picks a random interval of time to “sleep” for, before running the loop again from the start. The values for the sleep time can be changed in the code.

Should any of the if statements fail, the script will simply sleep, and run again after the interval. It will eventually meet with success and generate messages.

To quit the daemon process, it can be terminated manually, or a script called “stopdaemon.pl” can be executed, which will modify an external variable that the script needs to consult before it can run its “infinite” while loop.
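The routing and timing decisions in that loop can be sketched as follows. The sleep bounds and the message-count convention are illustrative assumptions; the real script is Perl and writes to message files:

```javascript
// Sketch: the daemon's two tunable behaviours -- every fifth message
// is an interim message, and the loop sleeps a random interval
// between passes.
const INTERIM_EVERY = 5;   // change this to alter the interim ratio
const SLEEP_MIN_S = 10;    // illustrative sleep bounds, in seconds
const SLEEP_MAX_S = 60;

function isInterim(messageCount) {
  // messageCount is the running total of messages generated so far.
  return messageCount % INTERIM_EVERY === 0;
}

function routeMessage(messageCount, message, files) {
  // Interim messages go to the interim file; the rest to the default.
  (isInterim(messageCount) ? files.interim : files.standard).push(message);
}

function nextSleepSeconds() {
  return SLEEP_MIN_S + Math.floor(Math.random() * (SLEEP_MAX_S - SLEEP_MIN_S + 1));
}
```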


46. Style.css

Location: htdocs folder

Responsibilities:This file specifies the style properties of the various tags for the interface.

Notes on the Code: This cascading style sheet provides the background properties for the entire interface, as well as specifying how the paragraph tags will appear. A portion of the code is as follows:

BODY

{

background: #FAEBD0 url(background.jpg);

}

P

{

margin:0in;

font-size:12.0pt;

font-family:"Times New Roman";

}

H1

{

margin:0in;

text-align:center;

font-size:24.0pt;

font-family:"Times New Roman";

color:#CC0000;

}


47. Writemenu.js

Location: htdocs folder

Responsibilities: This file contains all of the navigation bar code for every page in the interface.

Notes on the Code: Since some navigation links are only available to users who have active agents, the menu bar must reflect this. Therefore, various menu bars have been created to reflect the different states of the interface, and by simply calling the correct ID of a specific menu bar, the corresponding bar will be printed on the page by this script.

The JavaScript code itself is documented to identify the IDs of each menu bar. The code for one such bar is presented here:

//menutype == 2; plain menu bar, all blocked but create; home selected
if (menutype == 2) {
    document.write('<td bgcolor=#3366CC align=center width=50 nowrap>');
    document.write('<font color=#ffffff size=-1><b>Home</b></font>');
    document.write('</td>');
    document.write('<td width=15>');
    document.write('&nbsp;');
    document.write('</td>');
    document.write('<td bgcolor=#DBEAF5 align=center width=100 nowrap style=cursor:hand;>');
    document.write('<a href="../create.html" style=text-decoration:underline;><font size=-1>Create Agent</font></a>');
    document.write('</td>');
    document.write('<td width=15>');
    document.write('&nbsp;');
    document.write('</td>');
    document.write('<td bgcolor=#cccccc align=center width=100 nowrap onClick="disabledField()" style=cursor:default;>');
    document.write('<font size=-1>Modify Agent</font>');
    document.write('</td>');
    document.write('<td width=15>');
    document.write('&nbsp;');
    document.write('</td>');
    document.write('<td bgcolor=#cccccc align=center width=100 nowrap onClick="disabledField()" style=cursor:default;>');
    document.write('<font size=-1>Track Agent</font>');
    document.write('</td>');
    document.write('<td width=15>');
    document.write('&nbsp;');
    document.write('</td>');
    document.write('<td bgcolor=#cccccc align=center width=100 nowrap onClick="disabledField()" style=cursor:default;>');
    document.write('<font size=-1>Agent Results</font>');
    document.write('</td>');
    document.write('<td width=15>');
    document.write('&nbsp;');
    document.write('</td>');
    document.write('<td bgcolor=#DBEAF5 align=center width=50 nowrap style=cursor:hand;>');
    document.write('<a href="../help.html" style=text-decoration:underline;><font size=-1>Help</font></a>');
    document.write('</td>');
    document.write('<td width=15>');
    document.write('&nbsp;');
    document.write('</td>');
    document.write('<td bgcolor=#DBEAF5 align=center width=50 nowrap style=cursor:hand;>');
    document.write('<a href="./logout.cgi" style=text-decoration:underline;><font size=-1>Logout</font></a>');
    document.write('</td>');
}
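For illustration only, the state-to-menu relationship that the script implements can be sketched as a small pure function. The function name and return shape below are hypothetical (the real script emits table cells directly, as shown above); the sketch only mirrors which entries are live links in each state:

```javascript
// Hypothetical sketch of the menutype dispatch in Writemenu.js:
// returns which menu entries render as clickable links for a state.
// menutype 2 corresponds to "all blocked but create; home selected".
function liveLinks(menutype) {
  if (menutype === 2) {
    // Home is the selected page; Modify/Track/Results are greyed out.
    return ['Create Agent', 'Help', 'Logout'];
  }
  // Assumed: once the user has an active agent, the remaining
  // entries (Modify, Track, Agent Results) become live as well.
  return ['Create Agent', 'Modify Agent', 'Track Agent',
          'Agent Results', 'Help', 'Logout'];
}
```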


48. Cookiestuff.js

Location: htdocs folder

Responsibilities: This file contains all of the cookie-modification code for the interface.

Notes on the Code: This is fairly self-explanatory JavaScript code, but modifications to it will cause numerous interface errors if the cookies stop working as designed. Every script needs to consult the user’s cookies, as this is the only way to maintain state variables during a user’s session. The setCookie functions set cookies on the user’s computer, and the deleteCookie functions delete the same cookies. Cookies come in two “flavours”: ‘user’ cookies, which store ‘username’ variables, and ‘num’ cookies, which store ‘number of agents’ variables. The code is presented here:

// Set and delete a cookie with cookie detection
expireDateUser = new Date()
expireDateUser.setMonth(expireDateUser.getMonth() + 6)
expireDateNum = new Date()
expireDateNum.setMonth(expireDateNum.getMonth() + 6)

// Set the cookie
function setCookieUser(cookieName) {
    document.cookie = "username=" + cookieName + ";expires=" + expireDateUser.toGMTString() + ";path=/"
}

function setCookieNum(cookieName) {
    document.cookie = "numAgents=" + cookieName + ";expires=" + expireDateNum.toGMTString() + ";path=/"
}

// Delete the cookie
function deletecookieUser() {
    if (document.cookie != "") {
        thisCookie = document.cookie.split("; ")
        expireDate = new Date()
        expireDate.setDate(expireDate.getDate() - 1)
        for (i = 0; i < thisCookie.length; i++) {
            cookieName = thisCookie[i].split("=")[1]
            document.cookie = "username=" + cookieName + ";expires=" + expireDate.toGMTString() + ";path=/"
        }
    }
}

function deletecookieNum() {
    if (document.cookie != "") {
        thisCookie = document.cookie.split("; ")
        expireDate = new Date()
        expireDate.setDate(expireDate.getDate() - 1)
        for (i = 0; i < thisCookie.length; i++) {
            cookieName = thisCookie[i].split("=")[1]
            document.cookie = "numAgents=" + cookieName + ";expires=" + expireDate.toGMTString() + ";path=/"
        }
    }
}
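Since every script must consult these cookies to recover session state, a matching read helper is the natural counterpart to the setters above. The following is a hypothetical sketch (no such function appears in Cookiestuff.js); it takes the cookie string as an explicit argument so it can run outside a browser:

```javascript
// Hypothetical helper: look up one named value in a cookie string
// of the form produced by document.cookie, e.g.
// "username=andrew; numAgents=2". Returns null if absent.
function getCookieValue(cookieString, name) {
  const pairs = cookieString.split('; ');
  for (let i = 0; i < pairs.length; i++) {
    const eq = pairs[i].indexOf('=');
    if (pairs[i].substring(0, eq) === name) {
      return pairs[i].substring(eq + 1);
    }
  }
  return null;
}
```

In a page script this would be called as getCookieValue(document.cookie, 'numAgents').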


49. Text file samples

1. Profile file. Contains username, password, number of agents, and contact details.
Andrew.txt:
username=andrew&password=&email=Your+e-mail+address&name=Andrew+Patrick&address=Your+Address&city=Your+City&province=Your+State%2FProvince&postcode=Your+Postal+Code&phone=Your+Phone+Number&Country=CA&Gender=m&Month=0&Day=0&Year=9999&ControllerID=R6Z5B8&NumAgents=2

2. Default Preferences file. Contains the agent-handling preferences that are set at registration.
Andrew_defaultprefs.txt:

ci_employ=on&cv_employ=on&sp_employ=on&ci_month=11&ci_day=11&ci_year=02&cv_month=11&cv_day=11&cv_year=02&sp_month=11&sp_day=11&sp_year=02&ci_transparency=1&cv_transparency=1&sp_transparency=1&ci_access=1&cv_access=1&sp_access=1&ci_modify=1&cv_modify=1&sp_modify=1&ci_erase=1&cv_erase=1&sp_erase=1&ci_transfer=0&cv_transfer=0&sp_transfer=0&preset=on&username=andrew

3. Résumé file. Contains a list of all CV data. (Entries begin with “label=”)
Andrew_cv.txt:

label=CV+too&education=*Be+sure+to+include+your+highest+level+of+education+completed+&emp_history=*Indicate+all+relevant+past+work+experiences%2C+with+dates%2C+and+the+positions+you+have+held&skills=*List+all+relevant+skills+you+possess&addinfo=*You+may+enter+any+additional+information+here%2C+or+use+this+space+to+write+a+brief+summary+of+yourself+to+present+to+potential+employers&membership=none&PIIagreement=AGREE

label=test+cv&education=xxx&emp_history=ccc&skills=vcvcvc&addinfo=*bvbv&membership=no+union&PIIagreement=AGREE

label=test+resume&education=xxxxx&emp_history=xxxxxxxxxxxxxxxxxxxx&skills=xxxxxxxxxxxxxxxxxxxxxxxxx&addinfo=xxxxxxxxxxxxxxxxxxx&membership=none&PIIagreement=AGREE

4. Temporary agent creation file. Contains temporary agent information on a new agent before it is launched.
Andrew_agents_temp.txt:

label=xxx&Task=Job+Seek+Agent&Time=Tue+Dec+10+15%3A25%3A03+EST+2002&sector=&company=&salaryexp=&location=&employer=&CV=CV+too

5. Agent file. Contains a list of active agents and their instructions. (Entries begin with “label=”)
Andrew_agents.txt:
label=new+test&Task=Job+Seek+Agent&Time=Tue+Dec+10+15%3A01%3A39+EST+2002&sector=fishing&company=fishing+boat&salaryexp=lots+of+fish&location=on+the+water&employer=self-employed&CV=CV+too&ci_employ=on&cv_employ=on&sp_employ=on&ci_funds=on&cv_funds=on&sp_funds=on&ci_in_market=on&cv_in_market=on&sp_in_market=on&ci_ex_market=on&cv_ex_market=on&sp_ex_market=on&ci_list_broker=on&cv_list_broker=on&sp_list_broker=on&ci_month=12&ci_day=10&ci_year=03&cv_month=12&cv_day=10&cv_year=03&sp_month=12&sp_day=10&sp_year=03&ci_transparency=0&cv_transparency=0&sp_transparency=0&ci_access=0&cv_access=0&sp_access=0&ci_modify=0&cv_modify=0&sp_modify=0&ci_erase=0&cv_erase=0&sp_erase=0&ci_transfer=1&cv_transfer=1&sp_transfer=1&preset=on

label=Ottawa+jobs&Task=Job+Seek+Agent&Time=Tue+Dec+10+15%3A03%3A09+EST+2002&sector=High+tech+or+government&company=profitable&salaryexp=100%2C000&location=Ottawa&employer=Nortel+Networks&CV=test+resume&ci_employ=on&cv_employ=on&sp_employ=on&ci_month=01&ci_day=10&ci_year=03&cv_month=01&cv_day=10&cv_year=03&sp_month=01&sp_day=10&sp_year=03&ci_transparency=1&cv_transparency=1&sp_transparency=1&ci_access=1&cv_access=1&sp_access=1&ci_modify=1&cv_modify=1&sp_modify=1&ci_erase=1&cv_erase=1&sp_erase=1&ci_transfer=0&cv_transfer=0&sp_transfer=0&preset=on
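Files 1–5 all use the CGI form-submission encoding: ‘&’-separated key=value pairs, with ‘+’ standing for a space and %XX escapes for other characters. A hypothetical decoder sketch (not part of the PISA code) shows how one such record maps back to plain values:

```javascript
// Hypothetical sketch: decode one form-encoded record (the format
// of the profile, preferences, CV and agent files) into an object.
// '+' encodes a space; %XX escapes follow CGI form conventions.
function parseRecord(line) {
  const record = {};
  for (const pair of line.split('&')) {
    const eq = pair.indexOf('=');
    const key = pair.substring(0, eq);
    const value = decodeURIComponent(
      pair.substring(eq + 1).replace(/\+/g, ' '));
    record[key] = value;
  }
  return record;
}
```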

6. Messages file. Contains all the information for received messages. This is a daemon-generated file, unlike the previous files, which are created from CGI form-submission strings. The data is arranged in a sequential array; a new entry begins every six lines.
Andrew_messages.txt:
id=55
12/10/02
16:34:49
Résumé - test resume
CANCEL LIMIT-RCMP
Ottawa jobs
id=60
12/10/02
16:57:41
Search Instructions
LIMIT-Job Market
Ottawa jobs
id=65
12/10/02
17:30:13
Résumé - test resume
Job Seek
Ottawa jobs
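Assuming the six-line record layout described above (id, date, time, subject, action, agent label), a reader for this file could be sketched as follows. This is hypothetical and not part of the PISA daemon; the field names are illustrative:

```javascript
// Hypothetical sketch: split a daemon-generated messages file into
// records, assuming the documented layout of six lines per entry.
function parseMessages(text) {
  const lines = text.split('\n').filter(l => l.length > 0);
  const messages = [];
  for (let i = 0; i + 5 < lines.length; i += 6) {
    messages.push({
      id: lines[i].replace('id=', ''),  // e.g. "id=55" -> "55"
      date: lines[i + 1],
      time: lines[i + 2],
      subject: lines[i + 3],
      action: lines[i + 4],
      agent: lines[i + 5],
    });
  }
  return messages;
}
```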

7. Interim messages file. Also a daemon-generated file; the interim messages file looks exactly like the regular ‘messages’ file. Interim messages are kept in a separate file because they are processed differently on arrival.
Andrew_interim.txt:
See Andrew_messages.txt for format.
