Last Modified: Jan 12, 2012
Virtual Emergency Operations Center (vEOC)
Assumptions and Requirements Document
Purpose: This document outlines the assumptions and requirements underlying the design of the vEOC.
Table of Contents
1 Project Goals
2 Project Features
3 Project URL
4 Collaborators
4.1 Florida International University
4.2 Emory University
4.3 Notre Dame
4.3.1 Matt Mooney
4.3.2 Center for Research Computing (CRC)
4.3.3 REU Students
5 Virtual Teamwork
6 Grants
7 Development
7.1 Spiral Design
7.2 Ensayo Developmental Lifecycle
7.3 User-Centered Application Design
7.4 Expert Validation
7.5 Technologies Employed
7.6 Jetty Server
7.7 Virtual Machines
7.8 Secure Sockets Layer (SSL)
7.9 Redmine Server
8 WebEOC-like Console
9 Assumptions
9.1 EOC
9.2 Day-to-day Operations
9.3 Miami-Dade Incident Command Structure
9.4 EOC Floor Plan
10 Incident Command
11 User Views
11.1 Trainee
11.2 Observer
11.3 Scenario Manager
11.4 Administrator
11.5 Staff Member
11.6 Researcher
11.7 Exercise Controller
12 Trainee Positions
12.1 Liaisons
12.2 Logistics
12.3 Planning
12.4 Section Chiefs
12.5 Elected Officials
13 Mental Models
13.1 Trainee
13.2 Staff Member
13.3 Observer
13.4 Administrator
13.5 Researcher
13.6 Scenario Manager
14 Concept Maps
14.1 Emergency Manager Concept Map
14.2 Exercise Developer Concept Map
14.3 Planning Concept Map
14.4 Exercise Controller Concept Map
14.5 Trainee Concept Map
15 Types of Training
15.1 Individual
15.2 Groups
15.3 Organizational
15.4 Discussion-based
15.5 Operational-based
15.6 Seminars
15.7 Train the Trainer
15.8 Workshops
15.9 Tabletop Exercises
15.10 Games
15.11 Drills
15.12 Functional Exercises
15.13 Full-scale Exercises
16 Processes, Requirements, and Assumptions
16.1 Login Process
16.2 Logistics Process
16.3 Mission Task Requirements
17 Software Architecture
17.1 Trainee
17.2 Exercise Developer
17.3 Researcher
18 UML Diagrams
19 Purposes of CIMS
20 System Capabilities
21 Software Call Graph
22 Testing
22.1 Manual Testing
22.2 Automated Testing
22.3 Automated Testing Tools
22.4 Master Test Plan
23 Future Work
Appendix A: Master Test Plan
1 Project Goals
To build a virtual Emergency Operations Center for:
• Training emergency personnel
• Research into emergency management decision making
2 Project Features
• Distributed
• Web-based
• Intelligent agents that can supplant human trainees
• Interactive Advisor
• Dashboards
3 Project URL
The project URL (the development server) is http://veocdev.crc.nd.edu/veoc/RegularLogin2.php. Note: you must be connected to the Notre Dame campus VPN in order to access this URL.
4 Collaborators
We are collaborating with Florida International University and Emory University in the development of Ensayo.
4.1 Florida International University
• Prof. Irma Becerra-Fernandez
• Prof. Weidong Xia
• Denni Florian ([email protected])
4.2 Emory University
• Prof. Michael Prietula
• Rose (Qiuzhi Chang) ([email protected])
4.3 Notre Dame
4.3.1 Matt Mooney
4.3.2 Center for Research Computing (CRC)
• David Janosik
• Anna Alber
• Dr. Tim Wright
4.3.3 REU Students
We would also like to acknowledge all of the REU students who helped with this project.
5 Virtual Teamwork
We hold weekly teleconferences with our collaborators (see example agenda), usually set up via Skype on Tuesdays. One person in the group prepares an agenda and, after the meeting, writes up what was accomplished.
6 Grants
Project Ensayo has been supported under the following grants: NSF, GAANN, Zahm.
7 Development
7.1 Spiral Design
Ensayo has been developed using the spiral design model. The spiral model is an iterative model in which development proceeds through incremental releases of the system. The lifecycle usually begins with a prototype and proceeds in iterations, as outlined in Figure 1 below.
Figure 1*: Spiral Lifecycle
*Source: Wikipedia; touched-up figure of Boehm's original
7.2 Ensayo Developmental Lifecycle
Specifically, Ensayo has a multi-tier lifecycle (see Figure 2 below). There are five main tiers: simulated training for one individual versus the entire organization; artificial simulation of one individual versus the entire organization; different user roles (see the subsequent section on User Views); various trainee positions; and, finally, system modules. We began the design with simulated training for the entire organization; that is, training is designed for the entire organization to practice an exercise. After completing the training simulation of the entire organization, we will simulate training for selective groups of the Incident Command System (ICS). Finally, we will simulate training for one individual. Accordingly, simulation of artificial agents follows the opposite spectrum: first, we begin with no artificial agents; next, we simulate an individual agent; then a small group of agents; and finally, the entire organization with artificial agents. In terms of different user roles, we designed the consoles in the following order: Trainee Console, Exercise Developer Console, Researcher Console, Administrator Console, Staff Console. We began simulating trainee positions with the public liaisons. Next, we simulated the logistics section. Third, we moved to the Section Chiefs. Lastly, we simulated the incident command, which includes the mayor and incident commander(s). Finally, we designed the various architectural modules.
7.3 User-Centered Application Design
• Content: the features in the design
• Functionality: what the program is capable of doing
• Aesthetics: how pleasing the user interface is to the eye
• Usability: the user-friendliness of the design
7.4 Expert Validation
In order to validate the system and obtain an expert subject matter knowledge base, we are working with one of the foremost emergency operations centers in the country: the Miami-Dade EOC.
[Figure 2 content: the Ensayo developmental lifecycle diagram. Training/agent tiers: one individual, groups of individuals, entire organization. Trainee positions: liaison, logistics, Section Chiefs, incident commander, mayor. Consoles: Trainee Console (primary database, communications module), Scenario Manager (exercise driver), Researcher Console (research module), Staff Console (research module), Administrator Console (research module). System modules: logistics module, decision support, file system, tutorial module, scratchpad, interactive advisor, administration/configuration module, access control, redundant database, backup database.]
7.5 Technologies Employed
• On the client side, technologies include XHTML, CSS, Dynamic HTML, AJAX, Reverse AJAX, and JavaScript.
• On the server side, technologies employed are PHP, MySQL, DOJO, and the Jetty server.
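For illustration, Reverse AJAX (the server pushing status updates to the browser) can be approximated on the PHP side with long polling, although in Ensayo this capability is handled by the DOJO toolkit and Jetty. A minimal sketch follows; the endpoint, credentials, and `status_updates` table are hypothetical, not part of the actual vEOC schema:

```php
<?php
// longpoll.php -- hypothetical long-polling endpoint sketch, not the
// project's actual DOJO/Jetty Reverse AJAX implementation. The client
// sends the id of the last update it has seen; the server holds the
// request open until a newer update exists, then returns it as JSON.
$mysqli = new mysqli('localhost', 'veoc_user', 'secret', 'veoc'); // invented credentials
$lastId = isset($_GET['last_id']) ? (int) $_GET['last_id'] : 0;

$deadline = time() + 25;  // give up after ~25 s and let the client re-poll
header('Content-Type: application/json');
while (time() < $deadline) {
    $result = $mysqli->query(
        "SELECT id, board, message FROM status_updates WHERE id > $lastId ORDER BY id");
    if ($result && $result->num_rows > 0) {
        // fetch_all() requires the mysqlnd driver; loop over fetch_assoc() otherwise.
        echo json_encode($result->fetch_all(MYSQLI_ASSOC));
        exit;
    }
    sleep(1);  // nothing new yet; wait and check again
}
echo json_encode(array());  // timeout: empty response, the client polls again
```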
7.6 Jetty Server
We are using the Jetty Server version 6.1.14. The Jetty Server was chosen because of the DOJO toolkit/Reverse AJAX capabilities incorporated in the server. The Jetty website is http://jetty.codehaus.org/jetty/.
7.7 Virtual Machines
We are using virtual machines in the development of Ensayo, for several reasons. First, these virtual machines have provided the ability to sustain hardware failures: in case of a hardware failure, Ensayo can be up and running in a matter of minutes, as opposed to the much longer recovery time associated with running directly on the underlying hardware. Second, it is much simpler to back up the virtual machines than to back up traditional hardware. Furthermore, the state of Ensayo and the state of the underlying (virtual) hardware are backed up in addition to the project files, which also simplifies failure recovery. Finally, virtual machines allow for easy installation and setup of Ensayo: rather than needing to configure databases and servers, the user can simply start a virtual machine and have an instant configuration and setup of Ensayo.
7.8 Secure Sockets Layer (SSL)
For security enhancements, we are using port 443. All transmissions go through the Secure Sockets Layer in order to access Ensayo.
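As an illustration, a PHP entry script can enforce this by redirecting any plain-HTTP request to its HTTPS equivalent. The check itself is standard PHP; placing it at the top of each entry script is our assumption, not necessarily how Ensayo implements it:

```php
<?php
// Hypothetical guard at the top of an entry script (e.g. RegularLogin2.php):
// if the request did not arrive over SSL, redirect it to the HTTPS
// equivalent on port 443 and stop processing.
if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
    $target = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
    header('Location: ' . $target, true, 301);
    exit;
}
```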
7.9 Redmine Server
Development is coordinated through the Redmine Server at Notre Dame.
8 WebEOC-like Console
The vEOC is based on the WebEOC console, the Crisis Information Management System that Miami-Dade County uses at the EOC. It is composed of tab-based browsing and dynamically configurable boards.
Figure 1: Example screenshots from the WebEOC console.
In the original prototype, the system used a tab-based approach and stored the information in the client's web browser. Because emergency managers need a lot of information only some of the time, tabbed interfaces can waste space and become overloaded. In an alternative, windows-based approach, each menu item creates a new pop-up window. This allows individuals to have more information readily available and to switch easily between various statuses.
Figure 2: Example vEOC console.
Figure 3: Example vEOC status boards.
9 Assumptions
9.1 EOC
An Emergency Operations Center is a secure location where upper-level emergency officials gather to prepare for, manage, and coordinate the response to an incident (e.g. tsunami, earthquake, hurricane, pandemic).
9.2 Day-to-day Operations
In day-to-day operations, emergency management staff are involved in preparedness and mitigation strategies for future crises (ICS 1 2002). They are organized into Health and Human Services, Systems, Planning and Preparedness, Infrastructure and Recovery, and Personnel and Administration. See Figure 4.
Figure 4: Department of Emergency Management organizational structure. This is the day-to-day structure of emergency management at the Emergency Operations Center.
9.3 Miami-Dade Incident Command Structure
When a disaster strikes, however, the emergency management staff drop their day-to-day roles and take on the role assigned to them by the Incident Commander. This role usually involves leading a section or branch of the Incident Command System, or ICS (Johnson 2010). In accordance with ICS, there are four main sections: operations, planning, logistics, and finance/administration. (See Figure 5.)
[Figure 4 content: the Miami-Dade Department of Emergency Management organizational chart, showing the director, deputy director, executive secretary, public information officer, external affairs and governmental coordinators, and the managers and EM coordinators of the Health & Human Services, Systems, Planning & Preparedness, Infrastructure & Recovery, and Personnel & Administration bureaus.]
At Miami-Dade County, Operations is further organized into four branches: Public Safety, Human Services, Infrastructure, and Municipal. Planning consists of Geographic Information Systems (GIS), the 311 Public Information Call Center, and three units to aid in incident planning and documentation. Finally, Logistics is divided into EOC Support and Disaster Resources. The operations, planning, logistics, and finance/administration sections constitute the general staff. Leading the general staff and assuming responsibility for the incident is the Incident Commander. The Incident Commander also has additional support staff, called the command staff, which includes a safety officer, a public information officer, and a liaison officer (Irwin 1989). See Figure 5.
Figure 5: Incident Command System (IS-100.a 2008). The command staff, the general staff, and the agency liaisons assist the incident commander during an emergency.
9.4 EOC Floor Plan
Figure 6: Miami-‐Dade EOC Activation Floor Plan.
[Figure 6 content: the Miami-Dade Emergency Operations Center activation floor plan (dated 04/25/07). Seats are grouped into Public Safety, Human Services, and Infrastructure functional groups, each with a manager and assistant, alongside the Operations Section manager and assistant, EOC Support manager, Planning: Situation Assessment, the Intergovernmental Coordinator, MIT technical and application support, municipal/divisional EOCs, and seats for supporting local, state, and federal agencies. Agencies marked "REP Only" are present only for radiological emergencies.]
10 Incident Command
In the development of Ensayo, we modeled the command structure after the Miami-Dade County Emergency Operations Center's Incident Command System. This is a hybrid of NIMS (ESFs) and ICS: the ESFs are incorporated under the operations section of the ICS.
11 User Views
There are six different user views in the vEOC; that is, there are six main roles that a user may exercise: the trainee, the observer, the scenario manager, the staff member, the administrator, and the researcher.
11.1 Trainee
The trainee prepares for emergency situations and practices decision making by interacting with the vEOC. The trainee has the ability to modify all statuses in the vEOC.
11.2 Observer
The observer prepares for emergency situations by watching the trainees and other personnel in the vEOC. He or she cannot modify statuses in the vEOC; the observer can only read and search through archives.
11.3 Scenario Manager
The scenario manager creates scripts to train emergency personnel and moderates the exercise/training sessions. The scenario manager is essentially equivalent to the controller in a functional exercise, with the exception that injects are presented to the trainee automatically. The scenario manager has the added ability and responsibility to begin, pause, and terminate the exercise, and can also speed up or slow down the exercise clock, as sketched below.
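To make "fast time" concrete: exercise time can be modeled as real elapsed time multiplied by a speed factor. A minimal sketch with hypothetical class and field names (the actual vEOC exercise driver may differ):

```php
<?php
// Hypothetical exercise clock: simulated time advances at $speed times
// real time. A sketch of the scaling idea, not the actual exercise driver.
class ExerciseClock
{
    private $startReal;  // wall-clock time when the exercise (re)started
    private $startSim;   // simulated time at that moment
    private $speed;      // 1.0 = normal time, 2.0 = "fast time", 0.5 = slowed

    public function __construct($startSim, $speed = 1.0)
    {
        $this->startReal = time();
        $this->startSim  = $startSim;
        $this->speed     = $speed;
    }

    // Current simulated time: anchor + (real elapsed seconds * speed factor).
    public function now()
    {
        return $this->startSim + (int) ((time() - $this->startReal) * $this->speed);
    }

    // Change speed mid-exercise without jumping the simulated clock:
    // re-anchor at the current simulated instant, then switch rates.
    public function setSpeed($speed)
    {
        $this->startSim  = $this->now();
        $this->startReal = time();
        $this->speed     = $speed;
    }
}
```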
11.4 Administrator
The administrator maintains the vEOC software. The administrator also sets up and moderates user profiles.
11.5 Staff Member
Staff members are upper-level EOC staff; for example, they may be EOC planning section personnel. Staff members can print reports and analyze the performance and decision making of EOC personnel.
11.6 Researcher
Researchers are individuals interested in studying various aspects of decision making and emergency response. They typically are not EOC personnel.
11.7 Exercise Controller
Needs to be developed.
12 Trainee Positions
12.1 Liaisons
12.2 Logistics
12.3 Planning
12.4 Section Chiefs
12.5 Elected Officials
13 Mental Models
13.1 Trainee
13.2 Staff Member
13.3 Observer
13.4 Administrator
13.5 Researcher
13.6 Scenario Manager
14 Concept Maps
14.1 Emergency Manager Concept Map
Figure 7: An emergency manager concept map. This concept map shows the various functions that emergency managers engage in on an ongoing basis as well as during a crisis. The main activities center on both information and people.
14.2 Exercise Developer Concept Map
Figure 8: An exercise developer concept map. This concept map shows the various functions that exercise developers engage in when creating an exercise.
[Figure 8 content: exercise developers develop controller handbooks, player handbooks, evaluator handbooks, and scripts; moderate the flow of the exercise; and determine evaluation metrics based on objectives and target capabilities, drawing on past exercises and reports. Scripts consist of injects sent to players, which may be individuals, groups, or the organization as a whole.]
14.3 Planning Concept Map
Needs to be mapped out.
14.4 Exercise Controller Concept Map
Needs to be mapped out better.
14.5 Trainee Concept Map
Needs to be mapped out.
15 Types of Training
15.1 Individual
15.2 Groups
15.3 Organizational
15.4 Discussion-based
15.5 Operational-based
15.6 Seminars
15.7 Train the Trainer
15.8 Workshops
15.9 Tabletop Exercises
15.10 Games
15.11 Drills
15.12 Functional Exercises
15.13 Full-scale Exercises
16 Processes, Requirements, and Assumptions
16.1 Login Process
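The login flow is also captured in the UML summary in Section 18: the user authenticates through RegularLogin2.php and Authenticate.php, then picks a player position in index.php. As a rough illustration of the authentication step, a sketch in PHP follows; the users table, its columns, and the hashing scheme are assumptions, not the actual vEOC implementation:

```php
<?php
// Rough sketch of the authentication step (cf. RegularLogin2.php and
// Authenticate.php in Section 18). The users table, its columns, and the
// sha1 hashing scheme are assumptions, not the actual vEOC code.
session_start();
$mysqli = new mysqli('localhost', 'veoc_user', 'secret', 'veoc');

// Look up the user; bound parameters avoid SQL injection.
$stmt = $mysqli->prepare('SELECT role, password_hash FROM users WHERE username = ?');
$stmt->bind_param('s', $_POST['username']);
$stmt->execute();
$stmt->bind_result($role, $hash);

if ($stmt->fetch() && sha1($_POST['password']) === $hash) {
    // Record the authenticated user and role, then hand off to player
    // selection (cf. "Pick Player", index.php).
    $_SESSION['username'] = $_POST['username'];
    $_SESSION['role']     = $role;
    header('Location: index.php');
} else {
    header('Location: RegularLogin2.php?error=1');
}
exit;
```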
16.2 Logistics Process
Figure 10: The logistics resource request process.
This process was obtained through interviews with Craig Hall, the Logistics section chief at Miami-Dade. It outlines the roles of the Logistics Section Chief, the Government Services Agency, and the Department of Procurement Management in obtaining a resource. It also shows how Miami-Dade County attempts to fill a request in-house first; if this is not possible, it upchannels the request to the state (through the EM Constellation software program).
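As a sketch of that routing rule (the function and field names are hypothetical; in practice the decision runs through the Logistics Section Chief and EM Constellation, not a single function):

```php
<?php
// Hypothetical sketch of the routing rule in Figure 10: try to fill a
// resource request in-house first; otherwise upchannel it to the state
// (in reality, via the EM Constellation software program).
function routeResourceRequest(array $request, array $countyInventory)
{
    $item = $request['item'];
    $qty  = $request['quantity'];

    if (isset($countyInventory[$item]) && $countyInventory[$item] >= $qty) {
        // Filled locally through GSA / Department of Procurement Management.
        return array('filled_by' => 'county', 'status' => 'approved');
    }
    return array('filled_by' => 'state', 'status' => 'upchanneled');
}

// Example: 500 cots requested, but the county has only 200 on hand,
// so the request is upchanneled to the state.
$result = routeResourceRequest(
    array('item' => 'cots', 'quantity' => 500),
    array('cots' => 200));
```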
16.3 Mission Task Requirements
• Anyone can assign a mission/task to anyone else.
• Users should see only the missions/tasks that are assigned to them and those they have assigned to others (see the sketch after this list).
• Only the person who created the mission/task should be able to edit it.
• The person to whom a mission/task is assigned should be able to change its status.
• It would be nice to have a pop-up box alerting a person when he or she is assigned a new mission/task.
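A minimal sketch of how these visibility and edit rules might translate into queries. The missions table and its columns are hypothetical, not the actual vEOC schema:

```php
<?php
// Hypothetical sketch of the mission/task rules above; the missions
// table and its columns (id, title, body, status, created_by,
// assigned_to) are invented for illustration.
session_start();
$mysqli = new mysqli('localhost', 'veoc_user', 'secret', 'veoc');
$user = $_SESSION['username'];

// Visibility rule: a user sees only tasks assigned to them or created by them.
$list = $mysqli->prepare(
    'SELECT id, title, status FROM missions WHERE assigned_to = ? OR created_by = ?');
$list->bind_param('ss', $user, $user);
$list->execute();

// Edit rule: only the creator may edit the task itself...
$edit = $mysqli->prepare(
    'UPDATE missions SET title = ?, body = ? WHERE id = ? AND created_by = ?');

// ...while the assignee may change only its status.
$setStatus = $mysqli->prepare(
    'UPDATE missions SET status = ? WHERE id = ? AND assigned_to = ?');
```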
17 Software Architecture
17.1 Trainee
17.2 Exercise Developer
17.3 Researcher
18 UML Diagrams
[The UML diagrams in this section did not survive text extraction; the recoverable structure is summarized below.]

Login flow: Login (RegularLogin2.php; username, role, password) → Authentication (Authenticate.php) → Logout (logout.php; username, role). Within the vEOC, the user picks a player (index.php; pick group, pick individual within group, pick scripts), customizes screens (user resizes windows), and interacts with the console (mainpanel.php and exercisepanel.php).

Trainee console diagram: a Trainee works at a Desktop (customize screen, current state) with History, Reports, a Dashboard, and an Interactive Advisor (view, enlarge, decrease). A File component supports upload/download of the current file. Current Status supports changing status, requesting a resource, answering a communication medium, and responding to resource requests. A Console Manager (request resource, update status, open communication tool, log event (automatic?)) fronts Search (save, search, print log; Basic Search, Advanced Search), status boards (Infrastructure, Human Services, Public Safety, Special Needs, Hurrevac, SLOSH, SALT, eTeam), Basic and Advanced Logs, and Reports (Report Template, Report Library, Ad-hoc Report, Report Preview). An External Software Interface (open, close) connects Communication Media (send/receive): PDA, radio, face to face, phone, landline, cellular (listen, talk, text message, email), and email (send, receive, save, edit, delete), plus internet-based Communication Tools (open, close) and file publishing (save, edit, open file; status reports). Resource handling covers requesting, approving, logging, and budgeting resources. Supporting classes include a Timer (getTimeDifference(startTime, currentTime)), a Hint Manager (getEventDifference(currentEvent, correctEvent)), and an On-Load handler (register for events).

Observer console diagram: identical in structure to the Trainee diagram, with the Observer in place of the Trainee.

Scenario Manager diagram: a Newsroom Module (Calendar, Chat, Instant Messenger, Notes, Blog), a Control Module (Begin, Pause, Stop, End, Fast Time, Normal Time, Next Event), a script Library (Create, Edit, Delete, Save, Import, Export, Print Script), Search (Basic, Advanced), and Reports (save, print, save as template; Ad-hoc Report; Report Template with new, open, save, edit, delete).

Administrator diagram: a User Profile component (Add, Edit, Delete, Virtualize, and De-virtualize Member; Access Control) and an Environmental Agent (Virtualize, De-virtualize).

Staff Member diagram: Newsroom (News, Chat, Statistics, Calendar), Data Collection, Reports (Ad-hoc Report, Report Template, Report Library with new, open, save, edit, delete, import, export), and Search (Basic, Advanced).

Researcher diagram: Newsroom (News, Chat, Statistics, Calendar), Data Collection (Collection Template), Reports (Ad-hoc Report, Report Template, Report Library), Documents (upload/download file; new, delete, edit folder; new, edit, delete file), Library, and Search (Basic, Advanced).
19 Purposes of CIMS
Table 1: CIMS throughout the Emergency Response Lifecycle (Governmental/Emergency Managers)

Mitigation (day to day)
• Primary role of CIMS: Updating the CIMS
• Secondary role of CIMS: Data repository
• Example applications: WebEOC

Preparedness (day to day)
• Primary role of CIMS: Risk assessment, planning, analysis, training, policy development
• Secondary role of CIMS: Data repository
• Example applications: Ensayo, WebEOC

Response (during a crisis)
• Primary role of CIMS: Command and control, common operating picture, decision support, coordination, information control and dissemination
• Secondary role of CIMS: Resource acquisition, allocation, and exchange; documentation
• Example applications: WebEOC, SharePoint

Recovery (following a crisis)
• Primary role of CIMS: Coordination, information exchange, resource acquisition, allocation, and exchange, documentation
• Secondary role of CIMS: Common operating picture, decision support
• Example applications: Sahana

Additional CIMS characteristics: used throughout the emergency response lifecycle; easy to learn/use; adaptable; easy to install; backup; interoperable (with municipalities and state and national/international networks); trust in CIMS; distributed.
20 System Capabilities
1. Trainee
1.1. User Tutorial
1.1.1. Voiceover
1.1.2. Update to Maya?
1.2. Login
1.2.1. Main browser login message
1.2.2. User Login
1.2.3. Position Login
1.2.3.1. Select Role to Be
1.2.4. Select Script to Use for Exercise
1.3. Common Operating Picture
1.3.1. Starting Status
1.3.1.1. Check Starting Status
1.3.1.2. Create ability to have multiple starting statuses
1.3.2. Exercise Background
1.3.2.1. View Player Handbook
1.3.2.2. View EOC Floor Plan
1.3.2.2.1. Update to interactive floor plan
1.3.3. Road Closures
1.3.3.1. Check Status of Road Closures
1.3.3.2. Create Status of Road Closures
1.3.3.3. Edit/Update Status of Road Closures
1.3.3.4. Dynamic status updates
1.3.4. Shelters
1.3.4.1. Check Status of Shelters
1.3.4.2. Create Status of Shelters
1.3.4.3. Edit/Update Status of Shelters
1.3.4.4. Dynamic status updates
1.3.5. Hospitals
1.3.5.1. Check Status of Hospitals
1.3.5.2. Create Status of Hospitals
1.3.5.3. Edit/Update Status of Hospitals
1.3.5.4. Dynamic status updates
1.3.6. Points of Distribution (PODs)
1.3.6.1. Check Status of PODs
1.3.6.2. Create Status of PODs
1.3.6.3. Edit/Update Status of PODs
1.3.6.4. Dynamic status updates
1.3.7. Disaster Map
1.3.7.1. View the Disaster Map
1.3.7.2. Edit/Update Disaster Map
1.3.7.3. Clear Disaster Map
1.3.7.4. Save Disaster Map
1.4. Missions/Tasks
1.4.1. Create a Mission/Task
1.4.2. Edit/Update Mission/Task
1.4.3. Delete Mission/Task
1.4.4. Dynamic status updates
1.5. Resource Requests
1.5.1. Submit a Resource Request
1.5.1.1. Add standardized FEMA resource typing
1.5.2. Edit/Update a Resource Request
1.5.3. Delete a Resource Request
1.5.4. Check the Status of a Resource Request
1.5.5. Dynamic status updates
1.6. Significant Events
1.6.1. Post a Significant Event
1.6.2. Edit a Significant Event
1.6.3. Delete a Significant Event
1.6.4. Dynamic status updates
1.7. Position Log
1.7.1. Post a Position Log
1.7.2. Edit a Position Log
1.7.3. Delete a Position Log
1.8. Logistics
1.8.1. Acquire a Contract Resource
1.8.1.1. In-house
1.8.1.2. Out-of-house
1.8.1.2.1. Log to EM Constellation
1.8.1.3. Dynamic status updates
1.8.2. Approve a Resource Request
1.8.2.1. Update a Resource Request
1.8.2.1.1. Change the status of a resource request
1.8.2.1.2. Add notes section to resource request updates
1.9. Planning
1.9.1. Create Incident Action Plans
1.9.2. Edit Incident Action Plans
1.9.3. Delete Incident Action Plans
1.10. Disaster Assistant
1.10.1. Ask the Disaster Assistant a Question
1.10.2. Update Disaster Assistant?
1.11. Dashboards
1.11.1. Lives Saved, Injured, Deceased
1.11.1.1. Add dashboard data
1.11.1.2. Update dashboard data
1.11.1.3. Delete dashboard data
1.11.2. Cost to County
1.11.2.1. Add dashboard data
1.11.2.2. Update dashboard data
1.11.2.3. Delete dashboard data
1.11.3. View a dashboard
1.12. Injects
1.12.1. Acknowledge an inject
1.12.2. Clarify an inject
1.12.3. Respond to injects
1.12.4. Review received injects
1.12.5. Log injects
1.13. Chat
1.13.1. Initiate Chat
1.13.2. End Chat
1.13.3. Receive Chat
1.13.4. Accept Chat
1.13.5. Reject Chat
1.14. Logout
1.14.1. Release trainee role
1.15. Create Help Files
1.15.1. Find/use automatic help file creator
1.16. Logging
1.16.1. Log chats
1.16.2. Log significant events
1.16.3. Log user actions during the exercise
1.16.3.1. Log user responses to injects
1.16.4. Log position logs
1.17. Loose Ends
1.17.1. Add validation controls to interfaces
1.17.2. Update chat program?
1.18. Logout
1.18.1. Automatic logout if time expires
1.18.2. Automatic logout if user closes windows without logging out
2. Exercise Developer
2.1. Login
2.1.1. Main browser login message
2.2. User Tutorial
2.3. Handbook Developer
2.3.1. Update the handbook developer
2.3.2. Add figures to handbook developer
2.4. Starting Status
2.4.1. Create starting status
2.4.1.1. Update starting status
2.4.1.1.1. Text
2.4.1.1.1.1. Insert text
2.4.1.1.1.2. Update text
2.4.1.1.2. Figures
2.4.1.1.2.1. Insert figure
2.4.1.1.2.2. Change figure
2.4.1.1.2.3. Delete figure
2.4.1.2. Create multiple starting status reports
2.5. Target Capabilities
2.5.1. Target Capabilities
2.5.1.1. Add target capabilities to script
2.5.1.1.1. Add new target capability
2.5.1.1.2. Add target capability from database
2.5.1.2. Edit target capabilities
2.5.1.3. Delete target capabilities from script
2.5.2. Target Capability Metrics
2.5.2.1. Add target capability metrics to script
2.5.2.1.1. Add new target capability metric
2.5.2.1.2. Add target capability metric from database
2.5.2.2. Edit target capability metrics
2.5.2.3. Delete target capability metrics from script
2.5.3. Exercise Objectives
2.5.3.1. Create exercise objectives
2.5.3.2. Add exercise objectives to script
2.5.3.3. Delete exercise objectives from script
2.5.4. Create exercise handouts for evaluators
2.6. Scripting
2.6.1. Create a Script
2.6.2. Edit Script
2.6.2.1. Injects
2.6.2.1.1. Add inject from database
2.6.2.1.2. Add new inject
2.6.2.1.3. Delete an inject from the script
2.6.2.1.4. Edit an inject
2.6.2.1.5. Move injects around ad hoc
2.6.3. Delete Script
2.6.4. Import/Upload Script
2.6.5. Export Script
2.6.6. Archive Script
2.6.6.1. View archived script?
2.7. Database Controls
2.7.1. Clear the logs for the script
2.7.2. Reset the logs for the script
2.7.3. Clear the logs for the player
2.8. Exercise Controller
2.8.1. User Tutorial
2.8.2. Control the Exercise
2.8.2.1. Start Exercise
2.8.2.2. Pause Exercise
2.8.2.3. Terminate Exercise
2.8.2.4. Next Block
2.8.2.5. Fast Time
2.8.2.6. Move injects around ad hoc
2.8.3. Player Reports
2.8.3.1. View player reports
2.8.3.2. Filter player reports
2.8.3.3. Sort player report elements
2.8.3.4. More detailed player reports
2.8.4. Logout
2.9. Loose Ends
2.9.1. Add validation controls to interfaces
3. Researcher
3.1. Login
3.1.1. Main browser login message
3.2. Choose Exercise Metrics
3.2.1. Percentage of injects received but not responded to (missed)
3.2.2. Average inject response time (when does response time start and end?)
3.2.3. Correct responses to injects
3.2.4. Responses to injects within capability metrics
3.3. View Chat Logs
3.3.1. Analyze chat logs
3.3.1.1. Sort chat log elements
3.3.1.2. Filter chat log elements
3.4. View Position Logs
3.4.1. Analyze position logs
3.4.1.1. Sort position log elements
3.4.1.2. Filter position log elements
3.5. View Player Reports
3.5.1. Create more detailed player reports
3.5.1.1. Include expected user actions to injects
3.5.1.2. Better logging
3.5.2. View single player report
3.5.2.1. Analyze player reports
3.5.2.1.1. Sort player report elements
3.5.2.1.2. Filter player report elements
3.5.2.1.3. More detailed player reports
3.5.3. View multiple player reports
3.5.3.1. Analyze player reports
3.5.3.1.1. Sort player report elements
3.5.3.1.2. Filter player report elements
3.6. Logout
3.7. Loose Ends
3.7.1. Add validation controls to interfaces
4. Administrator
4.1. Create Console
4.1.1. Login
4.1.2. Create User Logins
4.1.3. Delete User Logins
4.1.4. Reset Locked Players
4.2. Manual Database Access
4.2.1. Modify tables and data in tables
4.3. Logout
4.4. Loose Ends
4.4.1. Add validation controls to interfaces
5. Database
5.1. Input Validation
5.1.1. Add validation controls to interfaces
6. Documentation
6.1. Developer
6.1.1. Inline documentation (code)
6.2. System Documentation
6.2.1. Flow charts
6.2.1.1. System Overview
6.2.1.2. Trainee
6.2.1.3. Exercise Developer
6.2.1.4. Exercise Controller
6.2.1.5. Researcher
6.2.1.6. Administrator
6.3. User Manuals
6.3.1. Installation
6.3.2. System Overview
6.3.3. Trainee
6.3.4. Exercise Developer
6.3.5. Exercise Controller
6.3.6. Researcher
6.3.7. Administrator
7. System Improvements
7.1. System Backup
7.1.1. Eclipse on personal computer
7.1.2. Servers (through svn)
7.2. General Loose Ends
7.2.1. Expand menu bars to fit screen
7.2.2. Adjust menu size to menu minimization
7.2.3. Salvation Army listed twice / Public Safety listed twice
7.2.4. Compatibility with different web browsers
7.2.4.1. Firefox 3.6.18 or greater
7.2.4.2. Internet Explorer
7.2.4.3. Safari
7.3. Review Reverse AJAX Functionality
7.3.1. Chat program
7.3.2. Remote functionality
7.3.3. Scripting
7.4. XML Standards
7.4.1. Create standards for data transfer and storage
7.4.2. Switch database to XML database
7.5. Artificial Tutoring System
7.5.1. Expert System
7.6. Code Release on SourceForge
7.6.1. Scrub documents
7.6.1.1. Create own database
7.6.1.2. Create passwords and URLs to database
7.6.1.3. Delete passwords and URLs to server
7.6.1.4. Use a virtual player?
7.7. Experiment with Cyberinfrastructure Lab at Notre Dame
7.7.1. Write journal paper
8. Testing
8.1. System
8.1.1. System scalability
8.1.2. Server scalability
8.2. Web Browser Compatibility
8.2.1. Firefox 3.6.18 or greater
8.2.2. Internet Explorer
8.2.3. Safari
8.3. Trainee Console
8.3.1. All elements working
8.3.2. Input validation
8.3.3. User interface design and functionality
8.3.4. Security
8.4. Exercise Developer
8.4.1. All elements
8.4.2. Input validation
8.4.3. User interface design and functionality
8.4.4. Security
8.5. Exercise Controller
8.5.1. All elements
8.5.2. Input validation
8.5.3. User interface design and functionality
8.5.4. Security
8.6. Researcher
8.6.1. All elements
8.6.2. Input validation
8.6.3. User interface design and functionality
8.6.4. Security
8.7. Administrator
8.7.1. All elements
8.7.2. Input validation
8.7.3. User interface design and functionality
8.7.4. Security
8.8. Database
8.8.1. Input validation
8.8.2. Scalability
8.8.3. Security
21 Software Call Graph
22 Testing
22.1 Manual Testing
22.2 Automated Testing
22.3 Automated Testing Tools
• Selenium
• Twill
• Watir
• Sahi
• Usabilla
• Loop11
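For illustration, a browser-level test of the login page could be scripted with PHPUnit's Selenium extension (current at the time of this document). The element names and expected text below are assumptions, not the actual vEOC markup:

```php
<?php
// Hypothetical Selenium RC test of the vEOC login page via PHPUnit's
// Selenium extension. Field names ('username', 'password', 'submit')
// and the asserted text are assumed, not taken from the actual markup.
require_once 'PHPUnit/Extensions/SeleniumTestCase.php';

class LoginTest extends PHPUnit_Extensions_SeleniumTestCase
{
    protected function setUp()
    {
        $this->setBrowser('*firefox');
        $this->setBrowserUrl('http://veocdev.crc.nd.edu/');
    }

    public function testValidLoginReachesPlayerSelection()
    {
        $this->open('/veoc/RegularLogin2.php');
        $this->type('name=username', 'testuser');
        $this->type('name=password', 'testpass');
        $this->clickAndWait('name=submit');
        $this->assertTextPresent('Pick Player');  // assumed post-login text
    }
}
```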
22.4 Master Test Plan (see Appendix A: Master Test Plan)
23 Future Work
In the future, we would like to enhance system capabilities. We also aim to release the project as open source and/or integrate Ensayo into other applicable projects (e.g. Sahana, WebEOC). We may want to separate the exercise controller role from the exercise developer. Remaining work also includes fixing outstanding bugs and improving testing.
Appendix A: Master Test Plan
Master Test Plan
Virtual Emergency Operations Center
vEOC
MASTER TEST PLAN
Version 1
May 2010
Table of Contents
2. Functionality Testing ................................................................................................................. 56 2.1. Testing all the “Links”: ............................................................................................................................................................. 56 2.2. Testing of the forms on the web pages: ............................................................................................................................ 56 2.3. Cookie Testing: ............................................................................................................................................................................ 57 2.4. Validation (HTML/CSS/PHP): .............................................................................................................................................. 57 2.5. Validation Checklists Tables: ................................................................................................................................................ 57 HTML Validation ...................................................................................................................................................................................... 57 Image Validation ..................................................................................................................................................................................... 58 Font Validation ......................................................................................................................................................................................... 59 Printer Friendly Validation ................................................................................................................................................................. 59 Style Sheet Validation ............................................................................................................................................................................ 60 Table Validation ....................................................................................................................................................................................... 60 Style Guide and Template Adherence ............................................................................................................................................. 61 Plug in Validation .................................................................................................................................................................................... 61 Test Station Validations ........................................................................................................................................................................ 61 Packaged Application Validation ..................................................................................................................................................... 62 Links and URL Validation ..................................................................................................................................................................... 62 Redirect Validation ................................................................................................................................................................................. 
63 Bookmark/Favorite Validation ......................................................................................................................................................... 63 Using The Browsers That Most Of The Clients Have ................................................................................................................. 64 Website Organization Testing Checklist ........................................................................................................................................ 64 Web Site Map Validation ...................................................................................................................................................................... 65 Search Engine Testing-‐Validate Accuracy And Performance Under Normal And Stress Loads ........................... 65 Link–Checking Tools Testing Checklist .......................................................................................................................................... 66 Validating Forms On A Web Site ....................................................................................................................................................... 67
2.5.1. Client vs. Server Side Validation: .......... 68
Validating Data on a Form .......... 68
Validating DHTML Pages .......... 69
Validating Pop-ups .......... 69
Streaming Content Checklist .......... 69
Common Gateway Interface (CGI) Script Validation .......... 70
Data Integrity Validation .......... 70
2.5.2. Server Side Validation: .......... 71
Server-Side Includes Validation .......... 71
Dynamic Server Page Validation .......... 71
Cookie Validation .......... 72
Maintaining a Session .......... 73
Managing Concurrent Users .......... 74
Site Level Usability Validation .......... 75
Page Level Usability Validation .......... 75
Readability Validation .......... 77
Language Validation .......... 77
Color Validation .......... 79
Screen Size And Pixel Resolution Validation .......... 79
Accessibility Validation .......... 79
Privacy Validation .......... 80
Acceptable Response Times .......... 80
Average Response Time Under Normal Conditions .......... 81
Determining Stress Points .......... 81
System Approaches Maximum Capacity .......... 82
Performance During Spikes .......... 83
2.6. Database Testing: ....................................................................................................................................................................... 84
3. Usability Testing .......... 84
3.1. Objective: .......... 84
3.2. Basic Usability: .......... 84
3.3. Methodology: .......... 85
3.3.1. Participants: .......... 85
3.3.2. Training: .......... 85
3.3.3. Procedure: .......... 85
3.3.4. Roles: .......... 87
3.3.4.1. Trainer: .......... 87
3.3.4.2. Facilitator: .......... 87
3.3.4.3. Data Logger: .......... 87
3.3.4.4. Test Observers: .......... 87
3.3.4.5. Test Participants: .......... 87
3.4. Usability Metrics .......... 87
3.4.1. Scenario Completion .......... 87
3.4.2. Critical Errors .......... 88
3.4.3. Non-critical Errors .......... 88
3.4.4. Subjective Evaluations .......... 88
3.4.5. Scenario Completion Time (time on task) .......... 88
3.5. Usability Goals .......... 88
3.6. Completion Rate .......... 88
3.7. Error-free rate .......... 89
3.8. Time on Task (TOT) .......... 89
3.9. Subjective Measures .......... 89
3.9.1. Problem Severity .......... 89
3.9.2. Impact .......... 89
3.9.3. Frequency .......... 90
3.9.4. Problem Severity Classification .......... 90
4. Compatibility Testing .......... 91
4.1. Browser Compatibility: .......... 91
4.2. OS Compatibility: .......... 91
4.3. Mobile Browsing: .......... 91
4.4. Printing Options: .......... 91
5. Performance Testing: .......... 91
5.1. Web Load Testing: .......... 92
5.2. Stress Testing: .......... 92
6. Security Testing .......... 92
1. Functionality Testing
Functionality testing covers: all the links in the web pages, the database connections, the forms used in the web pages to submit or retrieve information from users, and cookie handling.
1.1. Testing all the “Links”:
• Test the outgoing links from all the pages of the specific domain under test.
• Test all internal links.
• Test anchor links that jump within the same page.
• Test links used to send email to the admin or other users from web pages.
• Test to check whether there are any orphan pages.
• Lastly, check all of the above links for broken targets. (A link-checker sketch follows this list.)
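A minimal link-checker sketch in Python is shown below. It assumes the requests and beautifulsoup4 libraries are available; the base URL is a placeholder rather than the real vEOC address, and these libraries are illustrative choices, not tools mandated by this document.

    import urllib.parse

    import requests
    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    BASE_URL = "https://veoc.example.edu"  # placeholder, not the real vEOC URL

    def check_links(page_url):
        """Fetch a page, extract every <a href>, and report broken targets."""
        html = requests.get(page_url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        for anchor in soup.find_all("a", href=True):
            href = anchor["href"]
            if href.startswith(("mailto:", "#")):
                continue  # email and same-page anchor links need separate checks
            url = urllib.parse.urljoin(page_url, href)
            try:
                status = requests.head(url, allow_redirects=True, timeout=10).status_code
            except requests.RequestException:
                status = None
            if status is None or status >= 400:
                print(f"BROKEN: {url} (status {status}) on {page_url}")

    check_links(BASE_URL)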
1.2. Testing of the forms on the web pages:
Forms are an essential and integral part of any web site. Forms are used to get information from users and to keep interacting with them. The following should be checked on the forms:
• Check all the validations on each field.
• Check the default values of fields.
• Check the handling of wrong inputs to the fields in the forms.
• If there are options to create, delete, view, or modify forms, check them.
• Check that no empty forms can be created.
• There are different field validations, such as email IDs, user financial information, dates, etc. All of the above validations should be checked manually or in an automated way. (A server-side validation sketch follows this list.)
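The sketch below illustrates server-side field validation in Python. The field names, the deliberately simple email pattern, and the MM/DD/YYYY date format are all assumptions for illustration, not requirements from this document.

    import re
    from datetime import datetime

    EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # deliberately simple pattern

    def validate_form(data):
        """Return a dict of field -> error message; an empty dict means the form passes."""
        errors = {}
        if not data.get("name", "").strip():
            errors["name"] = "Name is required."
        if not EMAIL_RE.match(data.get("email", "")):
            errors["email"] = "Enter a valid email address."
        try:
            datetime.strptime(data.get("date", ""), "%m/%d/%Y")
        except ValueError:
            errors["date"] = "Enter a date as MM/DD/YYYY."
        return errors

    # Wrong inputs in every field should each produce an error message.
    print(validate_form({"name": "", "email": "not-an-address", "date": "13/45/2011"}))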
1.3. Cookie Testing: Cookies are small files stored on user machine that are basically used to maintain the sessions such as the ‘login sessions’.
• Test the application by enabling or disabling the cookies in your browser options • Test if the cookies are encrypted before writing to user machine • During the test for session cookies (i.e. cookies expire after the sessions ends)
check for login sessions and user stats after session end • Check effect on application security by deleting the cookies
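Parts of these cookie checks can be automated. The sketch below uses Python's requests library to log in (with hypothetical credentials and a placeholder URL) and then reports, for each cookie the site sets, whether it is a session or persistent cookie and whether the Secure and HttpOnly attributes are present.

    import requests

    LOGIN_URL = "https://veoc.example.edu/login"  # placeholder URL

    session = requests.Session()
    # Hypothetical credentials, for illustration only.
    session.post(LOGIN_URL, data={"user": "tester", "password": "secret"})

    for cookie in session.cookies:
        # Session cookies carry no expiry; persistent cookies do.
        kind = "session" if cookie.expires is None else "persistent"
        secure = "Secure" if cookie.secure else "NOT secure"
        httponly = "HttpOnly" if cookie.has_nonstandard_attr("HttpOnly") else "no HttpOnly"
        print(f"{cookie.name}: {kind}, {secure}, {httponly}")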
1.4. Validation (HTML/CSS/PHP):
• HTML/CSS validation is very important for optimizing the website for search engines.
• The site has a full and correct Doctype
• The site declares a character set
• The site uses valid XHTML
• The site uses valid CSS
• The site has no unnecessary ids or classes
• The site uses well-structured code
• The site has no broken links
• The site has no JavaScript errors (a scripted validation sketch follows this list)
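Markup validation can be scripted against the W3C Nu HTML Checker, which accepts a document via POST and returns JSON. The sketch below assumes network access to validator.w3.org and uses a placeholder page URL; the checker is one possible tool, not one prescribed by this document.

    import requests

    VALIDATOR = "https://validator.w3.org/nu/?out=json"  # W3C Nu HTML Checker endpoint

    def validate_html(url):
        """Fetch a page, submit it to the W3C checker, and print any errors."""
        html = requests.get(url, timeout=10).content
        resp = requests.post(
            VALIDATOR,
            data=html,
            headers={"Content-Type": "text/html; charset=utf-8"},
            timeout=30,
        )
        for message in resp.json().get("messages", []):
            if message["type"] == "error":
                line = message.get("lastLine", "?")
                print(f"line {line}: {message['message']}")

    validate_html("https://veoc.example.edu")  # placeholder URL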
1.5. Validation Checklist Tables:
Each checklist below records a Pass/Fail result and an optional description for every item.
HTML Validation (Pass/Fail):
• Any exceptions to W3C HTML 4.0 standards have been approved and documented
• HTML code is W3C HTML 4.0 compliant (barring any approved exceptions)
• The web page renders correctly when viewed with the Opera 5.0 browser
• Comments and change control logs are not included in the HTML sent to the client
Image Validation (Pass/Fail):
• The image adds value to the website
• If the image is animated, it links to the appropriate page
• The image is stored in the most appropriate format (e.g. .GIF files for buttons and .JPG files for photos)
• If a GIF file, the image size is a multiple of 8 pixels
• The visual size of the image is appropriate for the size of the viewable screen (it does not occupy too much or too little of the screen real estate)
• The physical file size of the image is as small as possible without compromising image quality, i.e. the file was saved using the optimum compression ratio
• An appropriate ALT tag is included with the image
• The WIDTH and HEIGHT tags (expressed as a page % rather than absolute pixel sizes) have been specified for the image
• The image is not copyrighted or trademarked by someone else
• The total size of the images on the page does not exceed 50 KB
• There is no more than one animated image on the page
• Photographic images aside (e.g. JPG images), no more than 256 colors are used on the web page
• Any image maps used are client side (as opposed to server side)
Font Validation (Pass/Fail):
• The font is proportional
• The primary font is Verdana, with Arial and sans-serif specified as alternates
• The browser's base font size is not altered
• Only relative font sizes are used (e.g. small, medium, and large) rather than specific point sizes
• No more than 3 font sizes are used on the web page
• Symbol fonts are used only when absolutely necessary
• If symbol fonts are used, they are properly mapped to the Unicode Private Use Area
• Browser default colors are not overridden
Printer Friendly Validation (Pass/Fail):
• The text on the web page is formatted correctly when printed on a 72 dpi printer using letter and A4 paper sizes
• The content of the web page is clearly readable when printed on a black-and-white printer
• The content of the web page is clearly readable when printed on a color printer
• The background of the web page is white
• Only dark colors are used for the text on the web page
Style Sheet Validation (Pass/Fail):
• The style sheet is W3C CSS Level 1 compliant
• The style sheet is correctly interpreted by 4.x-generation web browsers
• The style sheet complies with the printer-friendly standards
• The style sheet complies with the font standards
• The style sheet is defined as an external CSS file
• Web pages do not modify the style sheet dynamically
• Web pages that use the style sheet render acceptably when viewed by browsers that do not support CSS or have CSS turned off by the client
Table Validation (Pass/Fail):
• There are no unwanted spaces or carriage returns in the table
• No cell is overpopulated with too much verbiage
• Every cell in the table is populated (i.e. no null values), as some browsers collapse empty cells; extra scrutiny should be applied if the information is imported from a database dynamically
• The WIDTH and HEIGHT tags were specified for all cells using a screen % instead of absolute pixels wherever possible
Style Guide and Template Adherence (Pass/Fail):
• The web page follows the documented style guidelines (except where deviations have been documented and approved)
• The web page was based on the most appropriate web page template
Plug-in Validation (Pass/Fail):
• The website (after requesting the client's permission) lists the plug-ins and versions needed to view all the content on the site
• The web site is able to detect whether or not the required plug-ins are installed on the client side
• In the event that the web site is unable to accurately determine whether a plug-in is installed, the website contains an area that tells the user how to proceed
Test Station Validations (Pass/Fail):
• Different versions of the same brand of browser are installed in different instances of an operating system
• Only general-release software is used; no OEM, SP, or beta versions are used, with the exception of any required Y2K patches necessary for the software to work post-Y2K
• All of the installations use the installation defaults for directory names, cache sizes, fonts, plug-ins, etc.
Packaged Application Validation (Pass/Fail):
• Product documentation specifies the exact order in which the components should be installed and the configuration settings that are required or recommended
• Product documentation explains how to uninstall the product cleanly
• Product documentation adequately describes when and how the data files or database should be reorganized
• Automatic updates install and operate correctly on all of the supported platforms
• Automatic updates install and operate correctly when other applications have been added or removed before and after the update is performed
Links and URL Validation (Pass/Fail):
• The link is not broken and goes to the most appropriate location
• If the link is an internal link, it uses all lowercase characters
• If this link is an internal link, it uses relative addressing (i.e. it does not use an absolute address)
• If this link is an internal link, it does not launch a new browser window unless it is a help page
• If this link is an external link, it does launch a new browser window
• This link adds value to the website; links with little value add to the maintenance load (especially external links) and potentially make a webpage less usable
• The browser's GO/HISTORY list is updated correctly after using this link; some developers manipulate the browser's history and thereby degrade the website's usability
• When using the BACK button, previously entered data is not lost
• The link text does not wrap to two lines, which may confuse visitors into thinking that there are two links instead of one
Redirect Validation (Pass/Fail):
• The default 400, 401, 402, 403, and 404 error pages have been developed and properly configured on the production web server(s)
• If the link is redirected, it goes to the correct final destination (and is not redirected repeatedly)
• If a link points to a directory (instead of a specific web page), the link ends with a slash
Bookmark/Favorite Validation (Pass/Fail):
• Every web page has a bookmark that accurately reflects the contents of the webpage
• No bookmark is longer than 32 characters, since browsers typically truncate the display of verbose descriptions
• Each bookmark starts with “VEOC-”
Frameset Validation, Using The Browsers That Most Of The Clients Have (Pass/Fail):
• Pages using framesets are displayed correctly
• Frames are not resizable
• Pages within the framesets can be bookmarked
• The back button recalls the URL of the last frame viewed
• The initial frameset is downloaded in an acceptable period of time
• Pages using framesets can be printed correctly, or an alternate page is available for printing
• Nested framesets (if used) have sufficient screen real estate assigned to each frame
• All external links launch new browser windows (i.e. third-party web sites are not embedded inside the VEOC frameset)
• Search engines can find all of the content within the framesets
Website Organization Testing Checklist (Pass/Fail):
• “Core” web pages can be located within 4 clicks
• All the web pages in the website can be found by casually browsing the website (i.e. no need to resort to a site map or a search engine)
• Information on the site can be found using the search strategies that a visitor might consider
• The web site does not contain any orphaned files (i.e. files that cannot be reached by following any path from the home page)
Web Site Map Validation (Pass/Fail):
• All “core” web pages can be found using the site map
• Only “core” web pages are located on the site map
• Web pages are listed in an appropriate hierarchy
• Links are all functional and go to the correct pages

Search Engine Testing, Validating Accuracy And Performance Under Normal And Stress Loads (Pass/Fail):
• The first set of results is returned within 5 seconds (excluding Internet transmission times)
• The results are sorted appropriately (e.g. alphabetically or by % likelihood)
• The search engine functions correctly when a user enters common words that are likely to generate a huge number of hits, such as “a”, “the”, or “VEOC”
• The search engine functions correctly when the user enters non-existent words that are unlikely to generate any valid answers, such as “hggfkh”, “hjjgj”, or null requests
• The search engine ignores the source code used to build a web page and only indexes the content of the web page (e.g. requesting information on “JavaScript” will only return documents that reference JavaScript, not all of the web pages that use JavaScript in their source code)
• The search engine does not index sensitive words such as “secret” or “fraud”
• The search engine functions correctly when you enter a search string with the maximum number of characters plus one
• The search engine functions correctly when you enter multiple-word requests with or without the Boolean operators “and”, “or”, “not”, “+”, or “–”
• The search engine functions correctly when you enter one or more wildcards
• If fuzzy logic is enabled, the search engine offers alternate suggestions for zero-hit requests based on searches using a spellchecked version of the initial search string
Link-Checking Tools Testing Checklist (Pass/Fail):
• External links can be checked but (optionally) not scanned any further
• When encountering a recursive loop, the tool does not go into a death spiral
• The tool does not ignore duplicate links
• The tool is able to handle dynamic links
• The tool is able to handle framesets
• The tool is able to handle cookies (session/persistent)
• The tool can handle pages that require user inputs (e.g. forms)
• The tool facilitates identifying suspiciously large or small pages
• The tool facilitates identifying absolute links
• The tool facilitates identifying pages that are too deep
Validating Forms On A Web Site (Pass/Fail):
• All data entry fields have the HTML SIZE attribute set correctly (SIZE specifies the width of the field)
• All data entry fields have the HTML MAXLENGTH attribute set correctly (the maximum number of characters a user can enter)
• If radio controls are used, a default is always selected
• All required fields use a visual cue to indicate to the user that the field is mandatory
• If a form uses a drop-down data entry field (control), the options are sorted appropriately and the field is wide enough to display all of the options
• Data is not lost when the user clicks the browser's back (and subsequently forward) buttons midway through a series of forms
• Data is not lost when the user clicks the browser's forward (and subsequently back) buttons midway through a series of forms
• Data is not lost when the user clicks the GO/HISTORY buttons to revisit previous forms
• Data is not lost when the user clicks a bookmark or favorite midway through a series of forms
• Data is not lost when the user clicks the browser's reload button midway through a series of forms
• Data is not lost when the user resizes the browser window
• Duplicate data is not added to the database when a user presses any combination of the forward, back, go/history, bookmark/favorite, reload, or resize buttons midway through a series of forms
• The browser places the cursor on the most appropriate field/control when the form is first viewed
• Using the browser's tab key allows the client to tab through the input fields on the form in a top-to-bottom, left-to-right order
• If the form data is sent back to the web server using the HTTP GET command, the data is not truncated
1.5.1. Client vs. Server Side Validation:
Validating Data on a Form (Pass/Fail):
• All data entry fields are checked for invalid data, and an appropriate error message is displayed if the data is found to be invalid
• All validations are performed (error messages displayed) in a top-down, left-to-right fashion
• All required fields are checked on the client side
• Whenever possible, all field co-dependencies are checked on the client side
• All basic data checks are performed on the client side
• All client-side checks are rechecked on the server side
Validating DHTML Pages (Pass/Fail):
• DHTML is appropriate for most of the users' browsers
• All the DHTML code conforms to the W3C DHTML standard
• The pages are displayed and viewed correctly in different browsers
Validating Pop-ups (Pass/Fail):
• The website is able to detect a browser that has disabled (or does not support) JavaScript/Java/ActiveX and provides the user with an appropriate message
• The pop-up follows the web GUI standard
• The pop-up is not too large for the parent window, and its initial screen positioning is appropriate
Streaming Content Checklist (Pass/Fail):
• The streaming content server and the network are able to handle the expected demand for this service
• Clients are able to suspend and restart this service without needing to unsubscribe and re-subscribe
• Clients are able to adjust the frequency of updates to cater to different client-side bandwidths
Common Gateway Interface (CGI) Script Validation (Pass/Fail):
• The CGI script is able to parse input parameters containing quotation marks, carriage returns, ampersand symbols, dollar signs, question marks, and other control characters (see the parsing sketch after this checklist)
• The CGI script is robust enough to handle missing and out-of-range parameters
• The CGI script is robust enough to handle null values being returned from the database
• The CGI script is robust enough to handle a “no record found” code being returned by the database
• The CGI script is robust enough to handle a “duplicate record inserted” code being returned by the database
• The CGI script is robust enough to handle multiple records being returned by the database
• The CGI script is robust enough to handle a database timeout code being returned by the database
• The web server has sufficient resources to handle the expected number of CGI scripts that are likely to be initiated
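The first row of this checklist can be exercised with a defensive parameter-parsing sketch. The query string below is a contrived example containing quotes, encoded carriage returns, dollar signs, and question marks; the control-character stripping shown is one possible hardening approach, not a mandated one.

    import urllib.parse

    def parse_query(qs):
        """Parse a CGI query string defensively; special characters must not break it."""
        params = urllib.parse.parse_qs(qs, keep_blank_values=True)
        cleaned = {}
        for key, values in params.items():
            # Strip control characters that could corrupt logs or downstream parsing.
            cleaned[key] = ["".join(ch for ch in v if ch.isprintable()) for v in values]
        return cleaned

    # %0D%0A decodes to a carriage return / line feed, %24 to $, %3F to ?.
    print(parse_query('name=O"Hara&note=line1%0D%0Aline2&amount=%24100&q=what%3F'))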
Data Integrity Validation (Pass/Fail):
• A new record is inserted into the database
• A new record can be accurately read from the database
• The record is accurately updated in the database
• A record is completely deleted from the database

1.5.2. Server Side Validation:
Server-Side Includes Validation (Pass/Fail):
• All SSI and XSSI selection criteria are accurately documented, and each include file contains a “start of file” and “end of file” comment
• No JSSI files are used
• The appropriate content is displayed and formatted correctly for each of the possible selection criteria
• No “include” file references another “include” file; while technically possible, this programming style can be difficult to debug and can also impact performance
Dynamic Server Page Validation (Pass/Fail):
• The dynamically generated page is not a candidate for being replaced by one or more static pages
• Developers used a single language for all scripts within all dynamically generated web pages
• No “template” file references another “template” file; while technically possible, this programming style can be difficult to debug and can also impact performance
• All DSP templates have been inspected by at least one senior developer who was not the author of the template
• All high-frequency pages have been generated and manually tested
• All high-risk pages have been generated and manually tested
Cookie Validation (Pass/Fail):
When cookies are:
• Disabled before accessing the site, one of two things happens: the site works correctly, or the site issues a warning message telling the visitor that cookies must be turned on to access the site
• Disabled midway through a transaction, the site is able to detect the situation and handle it gracefully
• Deleted midway through a transaction, the site detects the situation and handles it gracefully

When a cookie is edited and some parameters are (see the signed-cookie sketch after this checklist):
• Added, the site detects the situation and handles it gracefully
• Deleted, the site detects the situation and handles it gracefully
• Swapped, the site detects the situation and handles it gracefully
• Set to null, the site detects the situation and handles it gracefully
• Edited and set to invalid values, the site detects the situation and handles it gracefully

Other validation tests include the following:
• When the client's PC memory or disk cache is cleared midway through a transaction, the site detects the situation and handles it gracefully; session cookies are stored in memory and typically don't get saved to the hard disk, while persistent cookies may need to be deleted manually
• When control characters or special operating system commands are added to a cookie, the site detects the situation and handles it gracefully
• When multiple entries for a website are added to the browser's cookies.txt file, the site detects the situation and handles it gracefully
• When the user identification field in the cookie is changed midway through a transaction, the site detects the situation and handles it gracefully; consider replacing the regular user-id account with values such as admin, test, superuser, or guest
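Most of the edited-cookie rows above come down to the server detecting tampering. One common technique, not prescribed by this document, is to sign cookie values with an HMAC so that any added, deleted, swapped, or edited parameter fails verification. The Python sketch below illustrates the idea with an illustrative secret and cookie value.

    import hashlib
    import hmac

    SECRET = b"server-side-secret"  # illustrative only; real keys belong in protected config

    def sign(value):
        """Return 'value.signature' so any client-side edit is detectable."""
        sig = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
        return value + "." + sig

    def verify(cookie):
        """Return the original value if the signature checks out, else None."""
        value, _, sig = cookie.rpartition(".")
        if not value:
            return None
        expected = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
        return value if hmac.compare_digest(sig, expected) else None

    cookie = sign("user_id=123")
    assert verify(cookie) == "user_id=123"           # untouched cookie is accepted
    assert verify("user_id=admin.deadbeef") is None  # edited cookie is rejected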
Maintaining a Session (Pass/Fail):
• The web application is capable of maintaining a single session through multiple browsers running on the same client
• The web application is capable of simultaneously accessing the same account through multiple clients
• Adequate database locking capabilities have been documented in the specification and have been properly implemented
• The web application time/date stamps transactions using the clock on the web server, not the clock on the client
• The web application is able to handle a user disabling cookies (session and/or persistent) midway through a transaction
• The web application is able to handle a user clearing the cache (disk or memory) midway through a session
• The web application is able to handle a user disabling JavaScript and/or VBScript midway through a transaction
• The web application is able to handle a user disabling Java applets and/or ActiveX controls midway through a session
• The web application is able to handle a user deleting the query portion of the website's URL midway through a session
• The load balancer (if used) is able to maintain a session
Managing Concurrent Users (Pass/Fail):
• Server memory is freed when a user completes a session or transaction
• Network connections are closed when a user completes a session or transaction
• Disk space is freed when a user completes a session or transaction
Site Level Usability Validation (Pass/Fail):
• There are no framesets in the website; framesets can be difficult to navigate, take too long to download, and cause print problems
• Content is structured in terms of simple hierarchies
• The user's mental model is consistent across the entire website; web page controls' behavior and aesthetics remain consistent
• The amount of time needed to complete a multi-page task (based on the number of pages) can be perceived by the user
Page Level Usability Validation (Pass/Fail):
• Graphics and other bandwidth-intensive elements are kept to a minimum
• Key functions such as search and help buttons are easy to find
• There are no competing options that might confuse the user or cause him or her to make an error
• The content is current, and previous content is available via an archive
• Related information on the same page has been grouped, thereby minimizing eye movement
• Critical information has not been placed on the lower portion of the webpage; if the position of this information requires the user to scroll down, most visitors are unlikely to ever read it
• Content makes up 50% to 80% of the screen real estate
• Vertical scrolling has been kept to a minimum, especially on navigational pages
• When viewed via the anticipated client's hardware/software, the page fits without the need for a horizontal scrollbar
• When printed via the anticipated clients, the page prints without being truncated
• The name and logo of the emergency center are visible on the page
• Browser features (e.g. HTML, JavaScript, etc.) that have been available for less than 1 year have not been used; a significant number of users use browsers that are more than 1 year old
• No pop-ups that open new browser windows are launched
• All links and graphics have a TITLE or ALT tag defined; decorative images (e.g. white space or formatting borders) should have a blank tag defined
• URLs are all lowercase
• There are no areas of large bright colors
• No more than 4 colors (ignoring graphics) have been used on the page
• The page background color is not dark
• All controls have been outlined in black for clarity, unless they are exceptionally small, i.e. less than 16x16 pixels
• Browser default link colors have not been overridden or altered
• Page object sizes have been specified as a % of the available screen, rather than as fixed pixel sizes
• Text has not been placed inside graphic files; this approach takes longer to download, can be more work to translate for multilingual websites, and may have quality issues with low-resolution displays such as WebTV
• If CSS is used, the web page still presents acceptably when style sheets are turned off or not available
• Three (3) alternative fonts (in the same order) have been specified for all text
• Font sizes have been specified as relative sizes (e.g. heading 1) rather than as absolute sizes
Readability Validation (Pass/Fail):
• A random selection of passages all scored 16 when measured using the Fry algorithm
• A random selection of passages all scored less than 25 when measured using the Fry algorithm
Language Validation (Pass/Fail):
• No presentation problems occur when the page is displayed in English
• No local slang is used anywhere on the page
• No offensive terms (when translated) are used
• Character sets for foreign languages are displayed correctly
• Foreign currencies are displayed correctly and converted if necessary
• Date and time formats are displayed correctly for the target countries (e.g. 20.01.00 for Europe versus 01/20/00 for the U.S.)
• Address formats are displayed correctly
• Translated words have been placed in the correct order on each webpage; unlike American and European languages, which read from left to right, some languages read from top to bottom and others read from right to left
• Each webpage can be viewed using a browser without any special modifications (e.g. the user doesn't have to install any non-standard fonts)
• Alphabetic lists are sorted correctly for each language
• Supporting documentation has been translated to English (e.g. help systems, error messages, order manuals, audio and video clips)
• The colors and symbols used on this website have a consistent meaning across all of the required languages (e.g. red implies danger in North America and happiness in China)
• Databases are set up to allow non-standard alphabets (e.g. double-byte characters)
Color Validation (Pass/Fail):
• The colors used on this website are friendly to color-blind users
• The colors used on the website are accurately displayed when using the minimum expected number of colors on a client
• All colors used on this website are browser-safe
Screen Size And Pixel Resolution Validation (Pass/Fail):
• The website has been designed to fit the requirements of the lowest likely screen size and pixel resolution used by a client; if the clients' capabilities vary significantly, then multiple websites have been developed to accommodate each client
• The appearance of each web page has been tested using different browsers and versions to ensure that the page is displayed as intended (e.g. no horizontal scroll bars)
Accessibility Validation (Pass/Fail):
• ALT tags are included with all images, and TITLE tags are included with all links
• Color is not used as the sole means of conveying information
• Web pages that make use of style sheets are still usable in browsers that do not support or have turned off this functionality
• Techniques that cause screen flicker are not used
• If image maps are used, an alternative list of the corresponding links is provided
• If applets or scripts are used, the web page is still usable if the functionality is turned off or not supported by the browser
• If video files are used, subtitles are also available
• If audio files are used, transcripts are also provided; in addition to viewers with hearing difficulties, some viewers may not have speakers installed
• The web page is understandable when heard through an audio-only browser
• The web page is understandable when viewed through a text-only browser
• Multiple key combinations can be entered sequentially or are mapped to a single key
Privacy Validation (Pass/Fail):
• The website has a legally valid privacy statement posted
• The website is approved by at least one external auditor
• The third-party seal of approval is accurately displayed alongside the privacy statement
Acceptable Response Times (time in seconds / description of action):
• Less than 0.1 seconds: This is the limit for having the user feel that the system is reacting instantaneously; no feedback regarding the time delay is necessary other than displaying the results. Example actions include button clicks or client-side drop-down menus.
• Less than 1.0 second: This is the limit for the user's flow of thought to remain uninterrupted, even though the user will notice a slight delay. Normally no feedback is necessary for delays between 0.1 and 1.0 seconds, though the user may lose the feeling of operating directly on the data. Example actions include the initiation of page navigation or Java applet execution.
• Less than 10 seconds: This is the maximum amount of time that can lapse while keeping the user's attention focused on the dialogue. Example actions include completion of page navigation.
Average Response Time Under Normal Conditions (Pass/Fail):
• 95% of the web pages download in less than 10 seconds when using a 28 kbps modem from any location within the continental US (see the timing sketch after this checklist)
• Orders are completed within 2 minutes of the user's request
• Confirmations of the requested actions are made within 30 seconds
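Timings like these can be sampled with a short script. The sketch below issues repeated GETs against placeholder vEOC page URLs and reports the median and an approximate 95th percentile; it measures the test machine's network conditions, not a 28 kbps modem, so it is a rough proxy for the goal above.

    import statistics
    import time

    import requests

    PAGES = ["https://veoc.example.edu/", "https://veoc.example.edu/status"]  # placeholders

    def time_page(url, samples=20):
        """Return a list of download times (seconds) for repeated GETs of one page."""
        timings = []
        for _ in range(samples):
            start = time.perf_counter()
            requests.get(url, timeout=30)
            timings.append(time.perf_counter() - start)
        return timings

    for page in PAGES:
        t = sorted(time_page(page))
        p95 = t[int(len(t) * 0.95) - 1]  # crude 95th percentile
        print(f"{page}: median {statistics.median(t):.2f}s, 95th percentile {p95:.2f}s")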
Determining Stress Points (maximum value / description):
• Determine the maximum number of requests/actions per second the website can handle
• Determine the maximum number of session initiations per hour that the website can be expected to support
• Determine the maximum number of concurrent users that the website can be expected to support
System Approaches Maximum Capacity (Pass/Fail):

At 80% capacity:
• Until the system returns to normal operating conditions, new clients who try to log on will be given a message to try again later
• Inactive clients will be given a warning message that they may be dropped from the system and not permitted to log on again until conditions return to normal
• Non-critical services will be shut down in order from least to most important
• Pager or email notification of potential gridlock is sent to technical support personnel

At 90% capacity:
• Inactive clients will be logged off
• Backup websites will be activated
• Pager or email notifications of potential gridlock are resent to technical support personnel

At 100+% capacity:
• The system does not allow any new requests to be initiated
• The system does not reboot itself
• The system does not shut down security services
• The system does not suspend any transaction logging
• The system does not gridlock
• Hardware components do not fuse or melt down
• Pager or email notifications of impending gridlock are sent to technical support

At any capacity:
• The system maintains its functional integrity
Performance During Spikes (Pass/Fail):
• No users who were logged in to the website prior to the spike are dropped
• Transactions/requests/actions that were started before the spike are still in progress and successfully completed after the spike
• New users are able to log in to the website during and after the spike
• Security services remain active during the spike
1.6. Database Testing:
• Check for data integrity and errors while you edit, delete, or modify the forms, or perform any other database-related functionality.
• Check that all the database queries execute correctly and that data is retrieved and updated correctly, as illustrated in the sketch below.
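The insert/read/update/delete cycle from the Data Integrity checklist above can be expressed as a self-contained test. The sketch below uses an in-memory SQLite database with a hypothetical incident table; the real vEOC schema will differ.

    import sqlite3

    # In-memory stand-in for the vEOC database; the real schema will differ.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE incident (id INTEGER PRIMARY KEY, title TEXT NOT NULL)")

    # Insert: a new record is written to the database.
    db.execute("INSERT INTO incident (id, title) VALUES (1, 'Flood warning')")

    # Read: the record comes back exactly as written.
    assert db.execute("SELECT title FROM incident WHERE id = 1").fetchone()[0] == "Flood warning"

    # Update: the change is persisted accurately.
    db.execute("UPDATE incident SET title = 'Flood watch' WHERE id = 1")
    assert db.execute("SELECT title FROM incident WHERE id = 1").fetchone()[0] == "Flood watch"

    # Delete: the record is completely removed.
    db.execute("DELETE FROM incident WHERE id = 1")
    assert db.execute("SELECT COUNT(*) FROM incident").fetchone()[0] == 0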
2. Usability Testing

2.1. Objective:
The goals of usability testing include establishing a baseline of user performance, establishing and validating user performance measures, and identifying potential design concerns to be addressed in order to improve efficiency, productivity, and end-user satisfaction. The usability test objectives are:
• To determine design inconsistencies and usability problem areas within the user interface and content areas. Potential sources of error may include:
o Navigation errors – failure to locate functions, excessive keystrokes to complete a function, failure to follow recommended screen flow.
o Presentation errors – failure to locate and properly act upon desired information in screens, selection errors due to labeling ambiguities.
o Control usage problems – improper toolbar or entry field usage.
• Exercise the application or web site under controlled test conditions with representative users. The data will be used to assess whether usability goals regarding an effective, efficient, and well-received user interface have been achieved.
• Establish baseline user performance and user-satisfaction levels of the user interface for future usability evaluations.
2.2. Basic Usability:
• The site should have a clear hierarchy
• Headings clearly indicate the structure of the document
• Navigation should be easy to understand
• Navigation is consistent throughout the site
• The site uses underlined links
• The site uses consistent and appropriate language
• The site has an easy-to-find sitemap and contact page
• The site has a search tool
• The site has a link to the home page on every page
• The site has clearly defined visited links

2.3. Methodology:
• Describe briefly the number of participants
• The setting of the usability test sessions
• Tools used to facilitate the participant's interaction with the application (e.g. browser)
• The measures to be collected, such as demographic information, satisfaction assessment, and suggestions for improvement

2.3.1. Participants:
• Thoroughly describe the number of participants expected, how they will be recruited, characteristics of their eligibility, and expected skills/knowledge.
• The participants' responsibilities will be to attempt to complete a set of
representative task scenarios presented to them in as efficient and timely a manner as possible, and to provide feedback regarding the usability and acceptability of the user interface. The participants will be directed to provide honest opinions regarding the usability of the application, and to participate in post-session subjective questionnaires and debriefing.
• Describe how the team will select test participants to meet stated
requirements. Explain if participants will have certain skills and/or background requirements, if they will be familiar with the evaluation tasks, or have experience with performing certain tasks.
2.3.2. Training:
• The participants will receive an overview of the usability test procedure, equipment, and software
• Describe any parts of the test environment or testing situation that may be nonfunctional

2.3.3. Procedure:
[Lab Testing]
Participants will take part in the usability test at [Florida International University] in [Emergency Operation Center]. A [type of computer] with the Web site/Web application and supporting software will be used in a typical office environment. The facilitator, seated in the same office, will monitor the participant's interaction with the Web site/Web application. Note takers and data logger(s) will monitor the sessions in an observation room, connected by a video camera feed. The test sessions will be videotaped.
The facilitator will brief the participants on the Web site/Web application and instruct the participants that they are evaluating the application, rather than the facilitator evaluating the participants. Participants will sign an informed consent form that acknowledges: that participation is voluntary, that participation can cease at any time, and that the session will be videotaped but their identity will be safeguarded. The facilitator will ask the participants whether they have any questions.
Participants will complete a pretest demographic and background information questionnaire. The facilitator will explain that the amount of time taken to complete the test task will be measured and that exploratory behavior outside the task flow should not occur until after task completion. At the start of each task, the participant will read aloud the task description from the printed copy and begin the task. Time-on-task measurement begins when the participant starts the task.
The facilitator will instruct the participant to ‘think aloud’ so that a verbal record exists of their interaction with the Web site/Web application. The facilitator will observe and enter user behavior, user comments, and system actions in the data logging application [describe how these metrics will be recorded if a data logging application is not used.]
After each task, the participant will complete the post-task questionnaire and elaborate on the task session with the facilitator. After all task scenarios are attempted, the participant will complete the post-test satisfaction questionnaire.
[For Remote Testing] Participants will take part in the usability test via remote screen-sharing technology. The participant will be seated at their workstation in their work environment. Verbal communication will be supported via telephone.
The facilitator will brief the participant and instruct that he or she is evaluating the Web site/Web application, rather than the facilitator evaluating the participant. Participants will complete a pretest demographic and background information questionnaire. Sessions will begin when the facilitator answers all participant questions. The facilitator will inform the participant that time-on-task will be measured and that exploratory behavior outside the task flow should not occur until after task completion.
The facilitator will instruct the participant to read aloud the task description from the printed copy and begin the task. Time-on-task measure will begin. The facilitator will encourage the participants to ‘think aloud’ and that a verbal record will exist of the task-system interaction. The facilitator will observe and enter user behavior and comments, and system interaction in a data logging application.
After each task, the participant will complete the post-task questionnaire and elaborate on the task session. After all tasks have been attempted, the participant will complete a post-test satisfaction questionnaire.
2.3.4. Roles:
The roles involved in a usability test are as follows. An individual may play multiple roles, and tests may not require all roles.
2.3.4.1. Trainer:
• Provides a training overview prior to usability testing

2.3.4.2. Facilitator:
• Provides an overview of the study to participants
• Defines usability and the purpose of usability testing to participants
• Assists in the conduct of participant and observer debriefing sessions
• Responds to participants' requests for assistance

2.3.4.3. Data Logger:
• Records participants' actions and comments

2.3.4.4. Test Observers:
• Observe silently
• Assist the data logger in identifying problems, concerns, coding bugs, and procedural errors
• Serve as note takers

2.3.4.5. Test Participants:
• Attempt to complete the set of representative task scenarios presented to them in as efficient and timely a manner as possible
• Provide feedback regarding the usability and acceptability of the user interface
2.4. Usability Metrics
Usability metrics refer to user performance measured against specific performance goals necessary to satisfy usability requirements. Scenario completion success rates, adherence to dialog scripts, error rates, and subjective evaluations will be used. Time-to-completion of scenarios will also be collected.
2.4.1. Scenario Completion
Each scenario will require, or request, that the participant obtain or input specific data that would be used in the course of a typical task. The scenario is completed when the participant indicates that the scenario's goal has been reached (whether successfully or unsuccessfully), or when the participant requests and receives sufficient guidance to warrant scoring the scenario as a critical error.
2.4.2. Critical Errors
Critical errors are deviations at completion from the targets of the scenario. Obtaining or otherwise reporting the wrong data value due to participant workflow is a critical error. Participants may or may not be aware that the task goal is incorrect or incomplete. Independent completion of the scenario is a universal goal; help obtained from the other usability test roles is cause to score the scenario a critical error. Critical errors can also be assigned when the participant initiates (or attempts to initiate) an action that will result in the goal state becoming unobtainable. In general, critical errors are unresolved errors during the process of completing the task, or errors that produce an incorrect outcome.

2.4.3. Non-critical Errors
Non-critical errors are errors that are recovered from by the participant or, if not detected, do not result in processing problems or unexpected results. Although non-critical errors can go undetected by the participant, when they are detected they are generally frustrating to the participant.
These errors may be procedural, in which the participant does not complete a scenario by the most optimal means (e.g., excessive steps and keystrokes). These errors may also be errors of confusion (e.g., initially selecting the wrong function, or using a user-interface control incorrectly, such as attempting to edit an un-editable field).
Non-critical errors can always be recovered from during the process of completing the scenario. Exploratory behavior, such as opening the wrong menu while searching for a function, [will, will not (edit Procedure)] be coded as a non-critical error.

2.4.4. Subjective Evaluations
Subjective evaluations regarding ease of use and satisfaction will be collected via questionnaires and during debriefing at the conclusion of the session. The questionnaires will utilize free-form responses and rating scales.

2.4.5. Scenario Completion Time (time on task)
The time to complete each scenario, not including subjective evaluation durations, will be recorded.

2.5. Usability Goals
The usability goals are as follows:

2.6. Completion Rate
Completion rate is the percentage of test participants who successfully complete the task without critical errors. A critical error is defined as an error that results in an incorrect or incomplete outcome. In other words, the completion rate represents the percentage of participants who, when they are finished with the specified task, have an "output" that is correct. Note: If a participant requires assistance in order to achieve a correct output then the task will be scored as a critical error and the overall completion rate for the task will be affected.
A completion rate of [100%/enter completion rate] is the goal for each task in this usability test.

2.7. Error-free Rate
Error-free rate is the percentage of test participants who complete the task without any errors (critical or non-critical). A non-critical error is an error that would not have an impact on the final output of the task but would result in the task being completed less efficiently. (A worked computation follows section 2.8.)
An error-free rate of [80%/enter error-free rate] is the goal for each task in this usability test.

2.8. Time on Task (TOT)
The time to complete a scenario is referred to as "time on task". It is measured from the time the person begins the scenario to the time he/she signals completion.
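Completion rate and error-free rate are straightforward to compute from per-participant tallies. The sketch below uses hypothetical observation data for four participants on a single task.

    def rates(results):
        """results: one dict per participant, e.g. {'critical': 0, 'noncritical': 1}."""
        n = len(results)
        completion = sum(1 for r in results if r["critical"] == 0) / n * 100
        error_free = sum(
            1 for r in results if r["critical"] == 0 and r["noncritical"] == 0
        ) / n * 100
        return completion, error_free

    observed = [
        {"critical": 0, "noncritical": 0},
        {"critical": 0, "noncritical": 2},
        {"critical": 1, "noncritical": 0},
        {"critical": 0, "noncritical": 0},
    ]
    completion, error_free = rates(observed)
    # 3 of 4 finished without critical errors (75%); 2 of 4 made no errors at all (50%).
    print(f"Completion rate: {completion:.0f}%   Error-free rate: {error_free:.0f}%")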
2.9. Subjective Measures
Subjective opinions about specific tasks, time to perform each task, features, and functionality will be surveyed. At the end of the test, participants will rate their satisfaction with the overall system. Combined with the interview/debriefing session, these data are used to assess the attitudes of the participants.

2.9.1. Problem Severity
To prioritize recommendations, a method of problem severity classification will be used in the analysis of the data collected during evaluation activities. The approach treats problem severity as a combination of two factors: the impact of the problem and the frequency of users experiencing the problem during the evaluation.
2.9.2. Impact
Impact is the ranking of the consequences of the problem, defined by the level of impact that the problem has on successful task completion. There are three levels of impact:
• High - prevents the user from completing the task (critical error)
• Moderate - causes user difficulty, but the task can be completed (non-critical error)
• Low - minor problems that do not significantly affect the task completion (non-critical error)
2.9.3. Frequency
Frequency is the percentage of participants who experience the problem when working on a task.
• High: 30% or more of the participants experience the problem
• Moderate: 11% - 29% of the participants experience the problem
• Low: 10% or fewer of the participants experience the problem
2.9.4. Problem Severity Classification
The identified severity for each problem implies a general reward for resolving it, and a general risk for not addressing it, in the current release.
Severity 1 - High impact problems that often prevent a user from correctly completing a task. They occur in varying frequency and are characteristic of calls to the Help Desk. Reward for resolution is typically exhibited in fewer Help Desk calls and reduced redevelopment costs.
Severity 2 - Moderate- to high-frequency problems with moderate to low impact; these are typical of erroneous actions that the participant recognizes need to be undone. Reward for resolution is typically exhibited in reduced time on task and decreased training costs.

Severity 3 - Either moderate-impact problems with low frequency or low-impact problems with moderate frequency; these are minor annoyance problems faced by a number of participants. Reward for resolution is typically exhibited in reduced time on task and increased data integrity.
Severity 4 - Low impact problems faced by few participants; there is low risk to not resolving these problems. Reward for resolution is typically exhibited in increased user satisfaction.
3. Compatibility Testing

3.1. Browser Compatibility:
Some applications are very dependent on browsers. Different browsers have different configurations and settings that the web page should be compatible with, and the web site's code should be cross-browser compatible. If the site uses JavaScript or AJAX calls for UI functionality, security checks, or validations, place extra emphasis on browser compatibility testing of the web application. Test the web application on different browsers such as Internet Explorer, Firefox, Netscape Navigator, AOL, Safari, and Opera, in different versions. (A cross-browser smoke-test sketch follows.)
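Cross-browser checks like these are commonly automated with a browser-driving tool. The sketch below uses Selenium WebDriver, an illustrative choice rather than one mandated by this document, to load a placeholder URL in Firefox and Chrome; the corresponding browser drivers must be installed on the test station.

    from selenium import webdriver  # pip install selenium; browser drivers required

    URL = "https://veoc.example.edu"  # placeholder

    for make_driver in (webdriver.Firefox, webdriver.Chrome):
        driver = make_driver()
        try:
            driver.get(URL)
            # A trivial cross-browser check: the page loads and no error page is shown.
            assert "error" not in driver.title.lower(), f"{driver.name}: error page"
            print(f"{driver.name}: loaded '{driver.title}'")
        finally:
            driver.quit()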
3.2. OS Compatibility:
Some functionality in the web application may not be compatible with all operating systems. New technologies used in web development, such as graphic designs and interface calls to different APIs, may not be available in all operating systems. Test the web application on different operating systems such as Windows, Unix, Mac OS, Linux, and Solaris, including different flavors of each.
3.3. Mobile Browsing:
Mobile browsing is widely expected to be the future of Internet browsing, so testing web pages in mobile browsers is highly important, and compatibility issues may arise there. Currently the system is not designed for mobile browsing, although this is an area that can be added in the future.
3.4. Printing Options:
Check the website's page-printing options: make sure that fonts, page alignment, and page graphics print properly. Pages should fit the paper size, or the size specified in the printing options.
4. Performance Testing:
The web application should sustain heavy load. Web performance testing should include web load testing and web stress testing.
4.1. Web Load Testing:
Test application performance at different Internet connection speeds. In web load testing, test whether many users can access or request the same page at the same time: can the system sustain peak load? The site should handle many simultaneous user requests, large input data from users, simultaneous connections to the database, heavy load on specific pages, etc. (A concurrency sketch follows.)
4.2. Stress Testing:
Generally, stress means stretching the system beyond its specified limits. Web stress testing is performed by deliberately trying to break the site, then checking how the system reacts to the stress and how it recovers from crashes. Stress is generally applied to input fields and to login and sign-up areas.

As part of web performance testing, website functionality on different operating systems and different hardware platforms is also checked for software and hardware memory-leak errors.
5. Security Testing
• Test by pasting an internal URL directly into the browser address bar without logging in. Internal pages should not open. (A scripted sketch of the first two checks follows this list.)
• If you are logged in with a username and password and browsing internal pages, try changing URL parameters directly. For example, if you are viewing statistics for a publisher site with site ID = 123, try changing the site ID in the URL to a different ID that is not related to the logged-in user. Access to other users' stats should be denied.
• Try invalid inputs in input fields such as the login username, password, and text boxes. Check the system's reaction to all invalid inputs.
• Web directories and files should not be directly accessible unless a download option is provided.
• Test the CAPTCHA against automated script logins.
• Test whether SSL is used as a security measure. If it is used, a proper message should be displayed when the user switches from non-secure http:// pages to secure https:// pages and vice versa.
• All transactions, error messages, and security-breach attempts should be logged to log files somewhere on the web server.
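The first two security checks in this list can be scripted. The sketch below assumes placeholder URLs, a hypothetical internal /admin page and /stats endpoint, and hypothetical credentials; it asserts that an internal page is refused without login, and that changing an ID parameter to one not owned by the logged-in user is denied.

    import requests

    BASE = "https://veoc.example.edu"           # placeholder
    INTERNAL = f"{BASE}/admin/incidents"        # hypothetical internal page

    # 1. Pasting an internal URL without logging in should NOT return the page.
    resp = requests.get(INTERNAL, allow_redirects=False, timeout=10)
    assert resp.status_code in (301, 302, 401, 403), f"internal page served: {resp.status_code}"

    # 2. A logged-in user changing an ID parameter should not see another user's data.
    session = requests.Session()
    session.post(f"{BASE}/login", data={"user": "tester", "password": "secret"})  # hypothetical
    other = session.get(f"{BASE}/stats?site_id=999", timeout=10)  # ID not owned by 'tester'
    assert other.status_code in (401, 403), "access to another user's stats was not denied"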