L643: Evaluation of Information Systems
Week 7: February 18, 2008
A Toolkit for Strategic Usability (Rosenbaum et al., 2000)
Most-used organizational approaches and usability methodologies:
- Heuristic evaluation (70%)
- Lab usability testing (65%)
- Fit into current engineering processes (63%)
- Task analysis (62%)
A Toolkit for Strategic Usability (Rosenbaum et al., 2000)
More extensive use of approaches and methodologies:
- Usability testing with portable lab equipment
- UI staff members co-located with engineering
- Field studies
- High-level/founder support
- Usage scenarios
- Participatory design
A Toolkit for Strategic Usability (Rosenbaum et al., 2000)
Less extensive use of approaches and methodologies:
- Educating/training other functional groups
- Focus groups
- Surveys (52%)
- Corporate mandates/usability objectives
- UI group reports to UI, not development
A Toolkit for Strategic Usability (Rosenbaum et al., 2000)
Examined the relationships between:
- Effectiveness ratings and % reporting use (Figure 1)
- Size of organizations and usability methods
- Types of companies and how respondents from these companies rate the success of organizational approaches and usability methods

Hypotheses:
- Do usability consultancies rank some or all usability methods more effective than in-house usability professionals do? [Yes]
- Do smaller companies have a better focus on their customer populations, and thus find contextual inquiries and task analysis more effective? [No]
Website Usability (Palmer, 2002)
Web usability (Nielsen, 2000):
- Navigation
- Response time
- Credibility
- Content
Media richness (Daft & Lengel, 1986, etc.)
Website Usability (Palmer, 2002)
Hypothesis 1: Websites exhibiting lower download delay will be associated with greater perceived success by site users.
Hypothesis 2: More navigable websites will be associated with greater perceived success by site users.
Hypothesis 3: Websites with higher interactivity will be associated with greater perceived success by site users.
Website Usability (Palmer, 2002)
Hypothesis 4: More responsive websites will be associated with greater perceived success by site users.
Hypothesis 5: Websites with higher-quality content will be associated with greater perceived success by site users.
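Hypotheses of this form are typically examined by relating each site metric to a perceived-success score. A minimal stdlib-only sketch of that analysis; all numbers are invented for illustration and are not from Palmer's study:

```python
# Pearson correlation from scratch, applied to hypothetical data for
# Hypothesis 1: lower download delay ~ greater perceived success.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical sample: per-site download delay (seconds) and
# mean perceived-success rating (1-7 scale).
delay = [0.8, 1.5, 2.3, 3.9, 5.2]
success = [6.1, 5.8, 5.0, 3.9, 3.2]

r = pearson(delay, success)
print(round(r, 3))  # strongly negative, consistent with Hypothesis 1
```

Palmer's actual analysis uses validated multi-item measures and more elaborate statistics; this only shows the basic direction-of-association logic behind the hypotheses.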
Website Usability (Palmer, 2002)
In this article:
- How did Palmer collect data? Is it appropriate?
- What is the research design?
- What are the findings?
- What are the implications?
- Are there any problems with this study?
Are Wikis Usable? (Désilets et al., 2005)
Quasi-ethnographic methods.
In-session data:
- Observing subjects asking questions
- Recorded interactions with the instructor
Post-session data:
- Inspecting the subjects' work
Are Wikis Usable? (Désilets et al., 2005)
A-priori severity categories:
- Catastrophe
- Impasse
- Annoyance

Bottom-up classification of events:
- Hypertext link creation and management
- Image uploading
- Creating/editing pages
- Hypertext authoring, etc.
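These two axes (a-priori severity, bottom-up event type) naturally produce a cross-tabulation of observed usability events. A small sketch with an invented event log (the category names come from the slide; the individual events are hypothetical, not from the study's data):

```python
# Tally observed usability events along the two classification axes
# described above: a-priori severity and bottom-up event type.
from collections import Counter

# Hypothetical event log: (severity, event_type) pairs.
events = [
    ("annoyance", "link creation"),
    ("impasse", "image uploading"),
    ("annoyance", "link creation"),
    ("catastrophe", "creating/editing pages"),
    ("annoyance", "hypertext authoring"),
]

by_severity = Counter(sev for sev, _ in events)
by_type = Counter(etype for _, etype in events)

print(by_severity.most_common())
print(by_type.most_common(1))  # most frequently problematic event type
```

Tallies like these are what let the authors argue which wiki features (e.g., link management) cause the most trouble and how severe that trouble tends to be.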
Updated D&M IS Success Model (2002, 2003)
[Figure: Updated D&M IS Success Model. Information Quality, System Quality, and Service Quality feed Intention to Use / Use and User Satisfaction, which together yield Net Benefits. The model spans creation, use, and consequences.]
IS Effectiveness: A User Satisfaction Approach (cf. Thong & Yap, 1996)
Criticisms of user satisfaction:
- Questionable operationalizations of the user satisfaction construct
- Poor theoretical understanding of the user satisfaction construct
- Misapplication of user satisfaction instruments
IS Effectiveness: A User Satisfaction Approach (cf. Thong & Yap, 1996)
Existing literature:
- Organizational effectiveness: no strong model of organizational effectiveness; no agreement on its measurement
- Information systems effectiveness: the difficulty of measuring organizational effectiveness led to measuring system usage and user satisfaction instead
- User satisfaction: LOTS of criticism of previous measurements; similarity between user satisfaction and the social & cognitive psychologists' notion of an attitude
IS Effectiveness: A User Satisfaction Approach
Definition of satisfaction: the extent to which users believe the IS available to them meets their information requirements
Assumption: if users are satisfied with the system, it is increasing their effectiveness
Underlying assumptions: workers are rational and want to be effective
Theory of Reasoned Action (Fishbein & Ajzen, 1975)
[Figure: Beliefs about consequences of behavior X → Attitude toward behavior X; Normative beliefs about behavior X → Subjective norm concerning behavior X; attitude and subjective norm → Intention to perform behavior X → Behavior X.]
End-user Computing Satisfaction: Figure 1 (Doll & Torkzadeh, 1988; 1994; Doll et al., 2004)
[Figure 1: End-User Computing Satisfaction as a second-order construct with five first-order factors and their items — Content (C1–C4), Accuracy (A1–A2), Format (F1–F2), Ease of use (E1–E2), Timeliness (T1–T2).]
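The item-to-factor structure in Figure 1 implies a straightforward scoring scheme: average the items under each first-order factor, then combine the factor scores. A sketch with hypothetical 5-point responses; the equal weighting here is a simplification, since the papers estimate structural weights for the factors:

```python
# Score the 12-item EUCS instrument: mean of items per factor,
# then an unweighted mean of the five factor scores.
FACTORS = {
    "Content":     ["C1", "C2", "C3", "C4"],
    "Accuracy":    ["A1", "A2"],
    "Format":      ["F1", "F2"],
    "Ease of use": ["E1", "E2"],
    "Timeliness":  ["T1", "T2"],
}

# Hypothetical responses on a 5-point scale.
responses = {"C1": 4, "C2": 5, "C3": 4, "C4": 4,
             "A1": 5, "A2": 4,
             "F1": 3, "F2": 4,
             "E1": 5, "E2": 5,
             "T1": 2, "T2": 3}

factor_scores = {name: sum(responses[i] for i in items) / len(items)
                 for name, items in FACTORS.items()}
overall = sum(factor_scores.values()) / len(factor_scores)

print(factor_scores["Timeliness"])  # 2.5
print(overall)
```

Breaking the overall score into factor scores is what makes the instrument diagnostic: a respondent can be satisfied overall yet score a single factor (here, Timeliness) low.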
End-user Computing Satisfaction: Figure 3 (Doll & Torkzadeh, 1988; 1994; Doll et al., 2004)
This model is robust, i.e., it can be used to measure and compare different subgroups (Hypothesis 1)
Some differences in structural weights (Hypothesis 2; Table 5)
User Satisfaction with Knowledge Management Systems (Ong & Lai, 2004)
A 21-item questionnaire that includes:
- Knowledge content (5 Qs)
- Knowledge map (4 Qs)
- Knowledge manipulation (4 Qs)
- Personalization (4 Qs)
- Knowledge community (4 Qs)

5 global items:
- Intention to use, intention to recommend
- Overall satisfaction, success of KMS
Questionnaire for User Interaction Satisfaction (QUIS)
QUIS 7.0 (http://www.cs.umd.edu/hcil/quis/):
- A demographic questionnaire
- 6 scales to measure reactions to the system
- 4 measures of interface factors:
  - Screen factors
  - Terminology & system feedback
  - Learning factors
  - System capabilities

See the questions at: http://www.otal.umd.edu/SHORE2000/telemenu/Survey.htm
Variations2's use of QUIS (http://variations2.indiana.edu/pdf/var-sat-survey.pdf)
Activity
According to Nevo and Wade (2007), 29% of IT projects failed, and another 53% were challenged.
Suppose you work for the IU COAS IT department. The dean of COAS has decided to install a new system to facilitate the application process for graduate students, because the current system doesn't work very well. You talked to the vendor of the software, and they mentioned that other schools have already implemented the system and were happy with it.
- Come up with a plan that would reduce "disappointment" within COAS when you introduce this new system
- Make sure to justify your decisions
More Instruments for User Satisfaction
- For electronic health records: http://www.aafp.org/ehrsurvey.xml
- For college computing (our own UITS): http://www.indiana.edu/~uitssur/