
EVALUATION OF HOSPITAL INFORMATION SYSTEMS

Chiotaki Nikomacheia

MSc Thesis

Submitted in partial fulfillment of the degree
of Master of Science
in Finance and Financial Information Systems

University of Greenwich

SEPTEMBER 2005


ABSTRACT

Introduction: Nowadays the adoption of hospital information systems plays a critical role in advanced health care delivery, the reduction of medical error and improved patient care. Evaluation of hospital information systems is mandatory for their successful adoption. In the hospital environment, the evaluation of hospital information systems is difficult due to the several factors involved. One of these factors, of special importance, is user satisfaction.

Purpose: To evaluate the level of satisfaction of users in the Kavala hospital in Greece, where the information technology infrastructure is in its early stages.

Materials and Methods: For the purpose of this study the System Usability Scale (SUS) questionnaire was used, which had not previously been used for evaluation in a hospital environment. The participants of the study worked in the Kavala Hospital, located in the homonymous Greek city. Users were divided into two groups: the administrative group, containing the administrative employees participating from the specific hospital, and the health care group, containing the health-related professionals employed in the same hospital.

Data Analysis: Users scored the existing information system as fairly usable. Users' opinions related to the system's frequency of use, ease of use and learning, inconsistency, integrity and need for technician support are discussed. The results of this pilot study showed significant correlations between several factors in each of the groups, which are discussed. Of special importance are the interrelations among Confidence-Integrity-Frequency of use and Integrity-Complexity-Inconsistency, observed in both groups.

Conclusion: The SUS proved an effective questionnaire for the evaluation of user satisfaction in hospitals. The newly designed "CIF" (Confidence-Integrity-Frequency of use) and "ICI" (Integrity-Complexity-Inconsistency) triangles are described with a good assessment of the results.


ACKNOWLEDGEMENTS

I wish to thank the patients and staff who helped with this study. I would also like to express my gratitude to my fiancé, parents and parents-in-law for their support.

"Except for the help listed in the Acknowledgements, the contents of this thesis are entirely my own work. This work has not previously been submitted, in part or in full, for a degree or diploma of this or another University or examination board."

Chiotaki Nikomacheia
September 2005


CONTENTS

ABSTRACT
ACKNOWLEDGEMENTS

CHAPTER ONE – LITERATURE REVIEW
1. LITERATURE REVIEW
1.1. INTRODUCTION
1.2. THE HOSPITAL INFORMATION SYSTEMS – GENERAL DESCRIPTION
1.2.1. General Description
1.2.2. The Structure of the Hospital Information System
1.2.3. The Functional Requirements of a Hospital Information System
1.2.4. Benefits of Hospital Information System Application
1.2.5. The Clinical Information System
1.3. INFORMATION FLOW IN THE HOSPITAL
1.3.1. Information Sharing
1.3.2. Handheld Devices
1.3.3. Shared Decision Making
1.3.4. Clinical Decision Support Systems
1.3.5. Computerised Physician Order Entry
1.3.6. Electronic Health Record
1.3.7. Electronic Patient Record Systems
1.4. INFORMATION TECHNOLOGY IN HOSPITALS
1.4.1. Information Technology and Error Reduction
1.4.2. Integration of Hospital Information Systems
1.5. EVALUATION OF HOSPITAL INFORMATION SYSTEMS
1.5.1. Evaluation to Health Care Organisations
1.5.2. Factors Affecting Evaluation
1.5.3. Evaluation Planning
1.5.4. Evaluation Methods
1.5.4.1. Formative and Summative Methods
1.5.4.2. The Objective or Quantitative Method and the Subjective or Qualitative Method
1.5.4.3. Randomised Controlled Trials
1.5.5. Models Referred for Evaluation
1.5.5.1. The Socio-Technical Model


1.5.5.2. The Technology Acceptance Model
1.5.5.3. The Task Technology Fit Model
1.5.5.4. Disconfirmation Theory and Dissonance Theory
1.5.5.5. Despont-Gros, Mueller and Lovis Model for User Evaluation
1.5.5.6. Validation of Telematic Applications in Medicine Guidelines
1.5.6. Problems in Evaluation of Information Technology in Health Care
1.5.6.1. Insufficient Evaluation Methods, Guidelines and Tools
1.5.6.2. Complexity of Evaluation Object
1.5.6.3. Conflicting Evaluation Questions
1.5.6.4. Funding and Number of Participants
1.5.6.5. Conflicts
1.5.6.6. Organisational Resistance
1.5.6.7. Training – Education of Health Care Professionals
1.5.6.8. Methodological Issues
1.5.7. Recommendations
1.5.7.1. Insufficient Evaluation Methods, Guidelines and Tools: Recommendations
1.5.7.2. Complexity of Evaluation Object: Recommendations
1.5.7.3. Conflicting Evaluation Questions: Recommendations
1.5.7.4. Number of Participants and Funding: Recommendations
1.5.7.5. Conflicts: Recommendations
1.5.7.6. Training – Education of Health Care Professionals: Recommendations
1.5.7.7. Methodological Issues: Recommendations
1.5.8. Information System Effectiveness and Success
1.5.9. User Acceptance
1.5.10. User Satisfaction
1.5.10.1. Variables Affecting User Satisfaction
1.5.11. Questionnaires as a Measure of User Satisfaction

CHAPTER TWO – MATERIALS AND METHODS
2. MATERIALS AND METHODS
2.1. MATERIALS
2.1.1. Kavala Hospital
2.1.2. System Usability Scale
2.1.2.1. Using the System Usability Scale


2.1.2.2. Scoring the System Usability Scale
2.1.3. Consent Form and Information Sheet
2.2. METHODS
2.2.1. Data Protection
2.2.2. Selection Criteria
2.2.3. Statistical Methods

CHAPTER THREE – DATA ANALYSIS
3. DATA ANALYSIS
3.1. SUS Score
3.2. Frequency of Use
3.3. Complexity
3.4. Easy to Use
3.5. Need for Support from Technician
3.6. Functional Integrity
3.7. Inconsistency
3.8. Quick Learning of the System
3.9. Cumbersome to Use
3.10. Confidence
3.11. Need to Learn Before the Use of the System
3.12. Correlations

CHAPTER FOUR – DISCUSSION
4. DISCUSSION
4.1. General Considerations
4.2. Correlations
4.3. Conclusion

APPENDICES
SUS Questionnaire
Information Sheet
Consent Form


1.1. INTRODUCTION

Hospital Information Systems (HISs) contribute to efficient, high-quality patient care (Heeks, 2005a) and comprise data transfer to the associated hospital employees, at the right place and time, promoting interoperability among them (Winter et al., 2003). In other words, an HIS is principally focused on the patient, as well as on medical and nursing care and the administrative and management issues needed to support these kinds of care (Heeks, 2005a). HISs indisputably offer a significant opportunity to improve the efficacy and efficiency of health care (Jaspers et al., 2004a) through their frequent application in Medical Informatics (Pietka, 2003). The implementation of an HIS affects the structures, the processes and the outcomes in the health care environment (Despont-Gros, Mueller and Lovis, 2005).

The introduction of methods and tools to support homogeneity and accountability of healthcare decisions and actions is important (Kalogeropoulos, Carson and Collinson, 2003). However, the increasing adoption of information technology (IT) in patient care necessitates the establishment of reliable evaluation of information systems (Lee, 2004). Nevertheless, the first issue in any evaluation is to define key questions that cover all relevant perspectives (Wyatt and Wyatt, 2003).

Furthermore, for the successful installation and adoption of an HIS, structured gathering of users' needs and system requirements is essential (Staccini et al., 2005). Therefore, users' opinions and satisfaction are fundamental for the successful adoption of an HIS (Wu and Wang, 2005).

The purpose of this study is to evaluate the adoption of an innovative HIS in a Greek hospital, located in the city of Kavala, with respect to user satisfaction. Currently, the existing HIS in this hospital is in its early stages. This effort concentrates on users' opinions of the existing information system, examining several aspects via the SUS questionnaire, which had not previously been used for the evaluation of user satisfaction in a hospital environment.
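For orientation, and since the SUS is central to the chapters that follow, the fragment below sketches the standard SUS scoring procedure (Brooke's 0-100 scheme, in which odd-numbered items are positively worded and even-numbered items negatively worded); the example responses are hypothetical and are not data from this study, whose actual scoring is described in Chapter Two.

    def sus_score(responses):
        """Compute a standard System Usability Scale score (0-100).

        `responses` holds the ten item answers, each on a 1-5 Likert scale,
        ordered as in the SUS questionnaire (item 1 to item 10).
        """
        if len(responses) != 10:
            raise ValueError("SUS requires exactly ten item responses")
        total = 0
        for item, answer in enumerate(responses, start=1):
            if item % 2 == 1:
                total += answer - 1      # positively worded (odd) item
            else:
                total += 5 - answer      # negatively worded (even) item
        return total * 2.5               # rescale the 0-40 sum to 0-100

    # Hypothetical single respondent, for illustration only.
    print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 3]))   # prints 77.5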


1.2. THE HOSPITAL INFORMATION SYSTEMS

1.2.1. General Description

A HIS is the socio-technical subsystem of a hospital (Brigl et al., 2005), using a database-based system built on a modularised Browser/Server structure (Chang et al., 2003). Heeks (2005a) describes health information systems as systems for processing data, information and knowledge in health care environments; HISs are just one category of health information systems, with a hospital as the health care environment.

According to Brigl et al. (2005), these systems support information management by:
- Combining the hospital aims with the defined targets of its strategic information management,
- Promoting paper-based and computer-based information processing,
- Identifying gaps in services, which determine the tactical strategy to be accomplished.

The three aspects of an innovative Health Information System are patient data management (through an Electronic Patient Record), medical decision support (through a Guideline Management System) and organisational support (through a Workflow Management System) (Ciccarese et al., 2005). However, its successful maturity and application demands working within an information partnership to maximise coordination, collaboration and cooperation (Maybloom and Champion, 2003).

1.2.2. The Structure of the Hospital Information System

Health Information Systems currently vary from distributed options (based on a messaging approach, a Grid-like linking approach, or portals with clearinghouse functions) to centralised options, which have one patient repository (Ruotsalainen, 2004). According to Pietka (2003), in order to make the structure of a Hospital Information System clear, the function of each hospital information module and its implementation in the several hospital departments have to be taken into account. Therefore, the overall structure of a Hospital Information System consists of:

- Patient-oriented modules
  Kernel modules, which comprise patient admission, discharge and transfer, registration of medical activities, registration of diagnoses and therapy, order entry, and access to patient data.
  Stand-alone modules designed for ancillary departments and applied to clinical departments with specific operational requirements, different from those of the general hospital orientation, such as radiology and clinical laboratories.
- Hospital-oriented modules
  Administrative modules
  Finance and billing modules
  Management information and decision-support modules (Pietka, 2003).

Good interface design requires a deep understanding of work practices to adequately represent these practices in system design specifications (Jaspers et al., 2004b).

1.2.3. The Functional Requirements of a Hospital Information System

The processes that should be supported by a HIS are defined by Gell et al. (2000) in a systematic list, whose structure has the following order: core processes, process fields, processes, and sub-processes or functions. Based on information managers' needs, Winter et al. (2003) have formulated the requirements on a meta-model for HISs and their management (a minimal illustrative sketch follows the list), which are:

- Besides software and hardware components, conventional tools have to be modelled.
- Information has to be modelled as entity types used by enterprise functions.
- Entity types should be represented in the form of datasets, forms and messages.
- Relationship modelling for enterprise functions, as well as for tools (software, hardware and conventional tools), should be considered, and their interworking should be in accordance with these relationships.
- Modelling of communication sequences is required for the successful interchange and communication of information among components.
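The following fragment is a minimal, purely illustrative sketch of such a meta-model; the class names and the example entries are invented for illustration and are not taken from Winter et al. (2003).

    from dataclasses import dataclass, field

    @dataclass
    class EntityType:
        """Information modelled as an entity type (represented as a dataset, form or message)."""
        name: str
        representation: str          # "dataset", "form" or "message"

    @dataclass
    class Tool:
        """A software, hardware or conventional (e.g. paper-based) tool."""
        name: str
        kind: str                    # "software", "hardware" or "conventional"

    @dataclass
    class EnterpriseFunction:
        """A hospital function that uses entity types and is supported by tools."""
        name: str
        uses: list = field(default_factory=list)          # EntityType objects
        supported_by: list = field(default_factory=list)  # Tool objects

    # Hypothetical example: patient admission modelled as an enterprise function.
    admission = EnterpriseFunction(
        name="Patient admission",
        uses=[EntityType("Admission record", "form"),
              EntityType("Admission notification", "message")],
        supported_by=[Tool("Patient management module", "software"),
                      Tool("Paper consent form", "conventional")],
    )
    print(admission.name, "uses", [e.name for e in admission.uses])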

1.2.4. Benefits of Hospital Information System Application

The benefits expected from the application of a Hospital Information System are several, such as facilitation of information sharing (knowledge management) (Kalogeropoulos, Carson and Collinson, 2003), compatibility, mass archives, security, high reliability, simple operation and support of medical information formats of images, figures and texts (Chang et al., 2003). Technology-enabled clinical management also contributes to cost control, the acquirement of quality of care and the rapid translation of biomedical research into patient care (Ball, 2003). On the other hand, Hospital Information Systems are recommended to support the scientific homogeneity and accountability of healthcare decisions and actions, and to contribute to an overall reduction in cost, improved quality of care and patient satisfaction (Kalogeropoulos et al., 2003). Finally, via Hospital Information Systems and their applications, the clinical decisions taken are more appropriate and medical errors are avoided (Johnson et al., 2004).

1.2.5. The Clinical Information System

A clinical information system (CIS) includes order entry and reporting systems, electronic patient records, telemedicine and decision support tools for health professionals, patients and the public (Wyatt and Wyatt, 2003). The purpose of a CIS, as part of the hospital information system, is to provide direct access to clinical information, immediate and easy storage of new information, and decision support (Feied et al., 2004). A clinical information database is essential for integration among health-related specialties in order to achieve optimisation of care (Dziuban, 1999). Clinical systems should require at most two steps for information access and should also be rapid, especially in emergencies, where the first seconds are critical; slower systems may lead to adverse events (Feied et al., 2004).

The functional integration of CISs at institutional or regional level is based primarily on the exchange of Health Level 7 (HL7) and Digital Imaging and Communication in Medicine (DICOM) messages (Katehakis et al., 2001). According to Tsiknakis et al., the promising tendency concerns an autonomous CIS and a modular underlying health information infrastructure offering facilitating services to the distributed clinical data of a patient.
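To make the message-based integration referred to above more concrete, the fragment below parses a minimal HL7 v2-style message with plain string handling; the message content and segment values are hypothetical, and a production system would normally rely on a dedicated HL7 library rather than this hand-rolled parser.

    # A minimal, hypothetical HL7 v2-style message: segments are separated by
    # carriage returns and fields by the pipe character.
    raw = (
        "MSH|^~\\&|LAB|HOSPITAL|HIS|HOSPITAL|200509011200||ORU^R01|00001|P|2.3\r"
        "PID|1||000123||DOE^JOHN\r"
        "OBX|1|NM|GLU^Glucose||95|mg/dL\r"
    )

    def parse_hl7(message):
        """Split an HL7 v2-style message into {segment_id: [field lists]}."""
        segments = {}
        for line in filter(None, message.split("\r")):
            fields = line.split("|")
            segments.setdefault(fields[0], []).append(fields)
        return segments

    parsed = parse_hl7(raw)
    pid = parsed["PID"][0]
    obx = parsed["OBX"][0]
    print("Patient:", pid[5], "| Observation:", obx[3], "=", obx[5], obx[6])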

1.3. INFORMATION FLOW IN THE HOSPITAL

1.3.1. Information Sharing

Knowledge sharing and its exploitation are encouraged for the development of research and evidence-based medicine (Loef and Truyen, 2005). Information sharing through the Electronic Patient Record also supports clinical research, population health, health administration, financing and health service planning (Takeda et al., 2000). On the other hand, the collection, analysis and exchange of clinical, billing and operational data within the organisation influences the functionality of health care services (Bose, 2003). Therefore, knowledge management systems should be considered too, as through improved knowledge sharing and creation, responsive customer services and advanced patient care could be obtained with a reduction in costs (van Merode et al., 2004).

Knowledge management has a considerable function and importance in the health care industry, mainly for hospitals and hospital systems, contributing to a more cost-effective and error-averse system of delivery (Guptill, 2005).

1.3.2. Handheld Devices

Handheld devices, such as the "personal digital assistant", "handheld", "palm pilot" or "pocket pc", have similar functions and are effectively applied in the healthcare sector, providing and continually improving access to clinical information and thus contributing to healthcare promotion (Lu et al., 2005). Furthermore, handheld devices promote patient safety and care delivery through access to appropriate resources (Taylor, 2005).

1.3.3. Shared Decision Making

Shared decision making concerns the processes during which patients are informed about the potential harms and benefits of options for treatment and screening decisions, with regard to patients' preferences (Ruland, 2004a). For correct health care decision making, patients' experiences, preferences and principles are regarded as vital (Ruland and Bakken, 2001). It is also worth mentioning that information sharing, as well as collection and storage, presents the dilemma of how to balance the level of patients' personal privacy protection against the level of information flow (Anderson, 2000).

1.3.4. Clinical Decision Support Systems

Computerised Clinical Decision Support Systems are software for decision-making support that compare patient-oriented information against a clinical knowledge database in order to arrive at recommendations or evaluations (Rothschild, 2004). These systems are associated with improved practitioner performance (Garg et al., 2005), medical error reduction (Kaushal et al., 2003) and the application of evidence-based clinical guidelines (Kuperman and Gibson, 2003).

1.3.5. Computerised Physician Order Entry

Computerised Physician Order Entry (CPOE) is the process that allows direct entry of medical orders by the health care decision maker (Kuperman and Gibson, 2003; Ash et al., 2005) and supports patient care by improving communication, information access and decision support (Rothschild, 2004). The CPOE process is demonstrated in Table 1. The effectiveness of these systems depends on the extent to which they are integrated with systems in clinical laboratories, radiology and patient records (Ball, 2003). The implementation of CPOE is not easy but is worthwhile for the intended positive results; to avoid the negative unintended consequences that may appear, clinicians should be trained in the use of the computerised systems (Kuperman and Gibson, 2003; Ash et al., 2005). The requirements and costs are presented in Table 2.

Table 1: CPOE permits the clinician to enter an order directly at a computer workstation, which is linked to a hospital clinical information system for execution (Rothschild, 2004).

- Hardware,
- Software,
- Technical support,
- Integration with existing systems,
- Extensive information infrastructure, sufficient number of workstations,
- Fast, secure and reliable computer network,
- Electronic interfaces between CPOE and other applications within the hospital (registration, laboratory, pharmacy, radiology and nursing documentation),
- Help-desk for acute resolution of technical problems,
- Staff training.

Table 2: Requirements and costs for the successful implementation of CPOE (Kuperman and Gibson, 2003).

1.3.6. Electronic Health Record

The Electronic Health Record (EHR) creates complete clinical documentation, representing a rich source of data concerning medical and non-medical patient information (Blobel, 2004) and leading to accurate decision making and decreased medical errors (Johnson et al., 2004). The EHR structure should provide reliable doctor-to-doctor and doctor-to-patient cooperation and communication, based on the patient's consent (Blobel, 2004).

1.3.7. Electronic Patient Record (EPR) Systems

Electronic Patient Record (EPR) Systems are powerful enough to support all the health-related information, the legal aspects and the wide variety of hospital information systems, contributing to savings in professionals' time and total labour cost (Monteiro, 2003). The two approaches developed for EPR architectures, as Takeda et al. (2000) presented in their study, are demonstrated in Table 3.

Document-oriented methodology:
- Use of the Internet
- Use of mark-up languages for standardisation of the electronic representation of paper-based health care documents and forms
- Documents are written in conformance to a particular Document Type Definition (DTD)
- Use of XML DTDs

Object-oriented methodology:
- Use of network technology
- Characteristics from the HL7 Reference Information Model (RIM), CEN, the Distributed Component Object Model (DCOM), CORBA and CORBAmed, and GEHR
- An object dictionary for translation of the definitions of the data to be transferred

Table 3: Characteristics of the two major approaches for EPR data architecture (Takeda et al., 2000).
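As a small illustration of the document-oriented column of Table 3, the fragment below builds a tiny XML patient document; the element names are hypothetical and do not follow any published DTD or mark-up standard.

    import xml.etree.ElementTree as ET

    # Hypothetical document-oriented EPR fragment: a paper encounter form
    # re-expressed as a mark-up document.
    record = ET.Element("patientRecord", id="000123")
    encounter = ET.SubElement(record, "encounter", date="2005-09-01")
    ET.SubElement(encounter, "department").text = "Cardiology"
    ET.SubElement(encounter, "diagnosis", code="I10").text = "Essential hypertension"
    ET.SubElement(encounter, "order", type="lab").text = "Lipid profile"

    print(ET.tostring(record, encoding="unicode"))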

1.4. INFORMATION TECHNOLOGY IN HOSPITALS

1.4.1. Information Technology and Medical Error Reduction

Information technology can be applied to limit medication errors in both inpatient and outpatient cases (Kaushal and Bates, 2002). On the other hand, the cost-effectiveness of most proposed improvements in error reduction and patient safety remains unknown (Warburton, 2005). Although IT interventions are expensive and demand the training of the hospital staff, the savings and the advantages are probably greater in most cases (Kaushal and Bates, 2002; Edwards and Moczygemba, 2004). The several applications of IT in hospitals are presented in Table 4.

According to Simpson (2004), IT implementation depends on:
- How well the system is managed by the users,
- How well the process is designed (structure, meeting the organisation's objectives), and
- Whether the computer program is adopted harmoniously by the organisation (technologically and socially).

Based on the increased demands on healthcare services, health information systems tend to be modified in order to offer a shared-care concept and a dilated network, promoting cooperation and communication among direct and indirect care providers, and also among different healthcare institutions, combined with minimum competitiveness (Blobel and Holena, 1998).

Table 4: Several IT applications facilitating the information flow in hospitals (Kaushal and Bates, 2002).

INFORMATION TECHNOLOGY APPLICATIONS IN HOSPITALS

- Computerised Physician Order Entry (CPOE): Improves the drug ordering process; orders are regulated, legible and complete; provides additional information to the physician for the medication in accordance with the patient's case; can be combined effectively with a Computerised Decision Support System (CDSS) or with computerisation of the medication administration record.
- Computerised Decision Support System (CDSS): The basic mode provides information for drug selection, dosing and duration; the advanced mode offers additional patient-specific and pathogen-specific information and advice to physicians.
- Computerisation of the medication administration record: In combination with CPOE, decreases the medication errors that happen at the transcribing stage.
- Automated dispensing: Robots can be used for the automation of the drug ordering, transcribing and dispensing stages.
- Automated drug distribution systems: Contain computer-controlled devices; used for packaging, dispensing and distributing medication.
- Medication bar coding: Identification of drug name, drug dose and administration time; staff and patient name identification.
- "Smart" intravenous devices: Used in cases of intravenous medication; devices with simplified programming and computerised checks; reduce intravenous medication errors; dose controlling.
- Computerised discharge prescriptions and instructions: Provide easy access in inpatient, outpatient and emergency room settings.
- Personal Digital Assistant: Immediately provides all up-to-date information for the patient; reduces medication errors; easy to use; inexpensive.


1.4.2. Integration of Hospital Information Systems

Since the HIS reflects the heterogeneity of a hospital, the need for integration, to concentrate internal communication among organisational units and health-related professionals, is intense (Winter et al., 2003). HISs must be integrated in accordance with the hospital's organisational structure (Ball, 2003), and therefore with RIS and PACS systems in order to be interoperable (Chang et al., 2003). Additionally, central planning for the decision process and a control system for assuring targets are essential as part of the integration (van Merode et al., 2004).

According to Monteiro (2003), the integration can take place along three dimensions: geographical distribution, heterogeneity and autonomy. The integration of healthcare information systems contributes to the reduction of medication errors and adverse drug events, via the systematic use of IT and improved error reporting (Anderson, 2004), and is necessary because applications of critical value exist in a large number of sections (Monteiro, 2003). Electronic Patient Record (EPR) Systems seem to be the future of integrated health information systems (Monteiro, 2003).

Hospital Resource Planning, as an application of Enterprise Resource Planning (ERP) systems in hospitals, enables hospitals to react more flexibly to changes in the environment (van Merode et al., 2004). These systems focus on the integration of business functions such as sales, orders, logistics, inventory, accounting and personnel (Monteiro, 2003), and also on controlling costs through improved resource management (van Merode et al., 2004).

1.5. EVALUATION OF HOSPITAL INFORMATION SYSTEMS

1.5.1. Evaluation to Health Care Organisations

According to Ammenwerth et al. (2004), "evaluation is the act of measuring or exploring properties of a health information system (in planning, development, implementation, or operation), the result of which informs a decision to be made concerning that system in a specific context". System evaluation in biomedical informatics should be a constant, strategically planned process (Miller, 1996), assisting information technology in keeping its role of transforming the shape and the structure of health care practices (Berg, 1999).

According to Dixon (1999), design, implementation and evaluation are engaged at all stages in a triangle scheme for the successful adoption of an information system, where each of these variables functions with the other two in a chicken-and-egg relationship. In this triangle, evaluation is a composite and multidisciplinary process with complicated answers about "how" and "what" to evaluate, especially in the health care environment, where the complexity of evaluation is much more evident than in other organisational areas (Despont-Gros, Mueller and Lovis, 2005). As evaluation is critical to the development and successful integration of knowledge-based systems (Clarke et al., 1994), some investigators tried to solve the latter problem by providing frameworks, models and tools for the evaluation of health information systems, and others tried to modify them. However, this attempt proved ineffective due to insufficient agreement concerning the different claims among the investigators (Despont-Gros, Mueller and Lovis, 2005).

Moreover, systems should be assessed in accordance with the standards defined from the research of activities that fit the hospital policy and with public health, risk management and financial factors (Feied et al., 2004). Strategic information management is important for maintaining and improving health information systems, preserving privacy and considering the need for new HIS architectures due to the new global environment, the extended use of data, including research, and the new types of data (Heeks, 2005a). The evaluation of HISs will be more demanding in the future, when recommendations will become regulations supported by national legal bodies (Ammenwerth et al., 2003a).

On the other hand, the adoption of modern computerised technology in a hospital is very expensive and should therefore be examined carefully, as should any expenditure without a measurable return on investment (Butler and Bender, 1990). The abilities to improve efficiency and outcomes while decreasing costs through information systems are all potential benefits of a comprehensive clinical information system. These may result from allowing multiple and instant simultaneous access to information, through data monitoring and alerting, through automation of protocols, and by collecting information for population-based health care as opposed to individual illness care (Chin and McClure, 1995).

Information system success is considered either as a multifactor aspect depending on context, objectives and stakeholders, or as a one-dimensional aspect, presented as a surrogate for satisfaction (Despont-Gros, Mueller and Lovis, 2005). It is therefore worth mentioning that health care information system failure is an important theme not only for information management professionals, but also for the consumers of health services (Beynon-Davies and Lloyd-Williams, 1999).

1.5.2. Factors Affecting Evaluation

According to Li (1997), the information system success evaluation process depends on 46 factors (Table 5), drawn from each and every functional area and from both health professionals and information systems personnel.

The most important variables for hospital information systems evaluation are the information system's success, user acceptance and user satisfaction (Despont-Gros, Mueller and Lovis, 2005). Acceptance and satisfaction are treated as equivalent by some researchers, whereas other researchers regard acceptance as a combination of user satisfaction and information system usage (Despont-Gros, Mueller and Lovis, 2005). Satisfaction and acceptance are also considered as user attitudes, as they influence and lead the interaction between users and technology (Despont-Gros, Mueller and Lovis, 2005).

Organisational factors are important predictors of the dispersal of information technology innovations, while the individual effect may vary for each innovation (Ash, 1997). In many cases the organisational issues have been demonstrated to be the most difficult parts of a system's implementation and operation (Southon, Sauer, and Dampney, 1999).

FACTORS AFFECTING THE EVALUATION PROCESS

Dimensions: System Quality; Service Quality; Information Use; Information Quality; Conflict Resolution; Individual Impact; User Satisfaction; Organisational Impact.

Factors: Response/turnaround time; Training provided to users; Convenience of access; User's understanding of the systems; Features of computer language; Means of input/output with the CBIS centre; Realisation of user requirements; Vendor's maintenance support; Correction of errors; Processing of requests for system changes; Security of data and models; Time required for systems development; Documentation of systems and procedures; Scheduling of CBIS products and services; Flexibility of the system; Attitude of the CBIS staff; Integration of the system; Technical competence of the Computer Based Information System (CBIS) staff; Volume of output; Accuracy of output; Competition between CBIS and non-CBIS units; Timeliness of output; Allocation priorities for CBIS resources; Precision of output; Relationship between users and the CBIS staff; Reliability of output; Personal control over the CBIS; Currency of output; Organisational position of the CBIS unit; Completeness of output; User's attitude toward using the CBIS; Format of output; Communications between users and the CBIS staff; Clarity of output; User's expectation of computer-based support; Instructiveness of output; Job effects of computer-based support; Top management involvement; Perceived utility; Charge-back method of payment for services; Effectiveness of the systems; User's confidence in the systems; Efficiency of the systems; User's participation; Productivity improved by the CBIS; Support of productivity tools.

Table 5: Factors affecting Hospital Information Systems evaluation (Li, 1997).


Perceived usefulness and perceived ease of use are key determinants of user acceptance and satisfaction. Perceived ease of use is "the degree to which a person believes that using a particular system would be free of effort". Perceived usefulness is "the degree to which a person believes that using a particular system would enhance his or her job performance" (Davis, 1989). Both these terms indirectly affect organisational mobility (Zain et al., 2005). The relation between perceived usefulness, perceived ease of use and the system characteristics affecting the probability of system use is examined by the Technology Acceptance Model (TAM) (Legris, Ingham and Collerette, 2003).

Recognising the pragmatic and skilled characteristics of health professionals' work is a vital aspect of information systems evaluation (Berg, 1999). Without measurement of effectiveness, information system assets may be undervalued or overvalued, conflicting with functional strategic planning (Grover, Jeong and Segars, 1996).

During information systems evaluation, the environment in which the information system will be implemented and the users who will use it in their information-processing role should also be considered (Ammenwerth et al., 2003a).

1.5.3. Evaluation Planning

Information system implementation is often validated through a cost-benefit analysis, although this analysis may include assumptions and factors confounding the procedure (Chin and McClure, 1995). According to Johnson (2005), information technology planning can be composed of three phases:

- The assessment phase: user needs, environmental factors, business objectives and IT infrastructure needs are documented and assessed during this phase.
- The prioritisation phase: during this phase the procedures are prioritised according to:
  – costs,
  – benefits,
  – risks, and
  – performance requirements (time and personnel requirements).
- The scheduling phase: during this phase considerations such as personnel availability and budgetary constraints are scheduled to produce an IT plan in accordance with organisational goals.

Wyatt and Spiegelhalter (1990) proposed that the evaluation of medical expert systems be done in two testing stages:
- Laboratory testing, in which potential users' and developers' perceptions are important.
- Field testing, during which the study must be designed to test without predispositions, in order to examine the system's influence on structure, process and outcome in health care delivery.

Clarke et al. (1994) described a development evaluation cycle comprising the following stages:
(i) the early prototype development,
(ii) the validity of the system,
(iii) the functionality of the system, and
(iv) the impact of the system.

Beuscart-Zephir et al. (1997) supported in their study the necessity, during the process of evaluation, of:
– evaluating the usability of the Information Technology (IT), and
– understanding the purpose of the management of information,
in order to help healthcare professionals integrate information management into their daily activity. In the preliminary phase, where users' requirements are taken into account, developers of information systems must use quality management techniques to guarantee that the system will satisfy the given requirements (Beuscart-Zephir et al., 1997).

Measurements of information system assessment include techniques, medical and health efficacy, economics, sociology, and law and ethics (Grémy and Degoulet, 1993). According to Ammenwerth et al. (2003a), the main aspects that should be taken into account during hospital information systems evaluation are the following:


- Particular attention to information technology selection and installation,
- Technical and system features, concerning the performance and the software quality of the selected information system,
- User acceptance,
- System usability,
- System effectiveness in structural and process quality in an enterprise with many different kinds of users,
- System effectiveness in quality of care,
- Patient satisfaction with information technology,
- Investment and operational cost for the information technology adoption.

The challenge in an information system project is designing an evaluation that captures the complexity of the interactions, interrelationships and inter-effects occurring during a series of processes which change the organisation, the people and the information system involved (Kaplan, 1997).

1.5.4. Evaluation Methods

1.5.4.1. Formative and Summative Methods

Formative assessment measures and observes the information resources themselves at the several stages of development and also concerns the technical verification of an application (Talmon et al., 1999). Formative evaluation provides feedback for improvements before the final product is put forth (Currie, 2005).

On the other hand, summative evaluation concerns the measurement and the performance of the information resources and the behaviour of the people who use these resources (Talmon et al., 1999). Summative research examines the impact or the outcomes associated with the use of the system (Currie, 2005). Therefore, both formative and summative approaches are applicable to computer system evaluation, because informatics evaluation takes place both during and after system development (Currie, 2005).


1.5.4.2. The Objective or Quantitative Method and the Subjective or Qualitative Method

According to Barbour (1999), qualitative approaches may contribute to quantitative work and vice versa, and their combination is important in health services research.

The objective or quantitative method is used to collect objective data, such as patient waiting times, the number of lab tests ordered per patient or staff satisfaction on a 1-5 scale, using quantitative measurement standards such as user satisfaction, usage indicators or time studies (Wyatt and Wyatt, 2003).
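A minimal sketch of how such objective indicators might be computed from routinely collected records is given below; the field names and figures are hypothetical and serve only to illustrate the kind of quantitative measures meant here.

    # Hypothetical visit records: waiting time in minutes and lab tests ordered.
    visits = [
        {"patient": "A", "waiting_min": 35, "lab_tests": 2},
        {"patient": "B", "waiting_min": 50, "lab_tests": 0},
        {"patient": "C", "waiting_min": 20, "lab_tests": 4},
    ]
    # Hypothetical staff satisfaction responses on a 1-5 scale.
    satisfaction = [4, 3, 5, 2, 4]

    mean_wait = sum(v["waiting_min"] for v in visits) / len(visits)
    tests_per_patient = sum(v["lab_tests"] for v in visits) / len(visits)
    mean_satisfaction = sum(satisfaction) / len(satisfaction)

    print(f"Mean waiting time: {mean_wait:.1f} min")                  # 35.0 min
    print(f"Lab tests ordered per patient: {tests_per_patient:.1f}")  # 2.0
    print(f"Mean staff satisfaction (1-5): {mean_satisfaction:.1f}")  # 3.6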

The subjective or qualitative research techniques provide answers concerning the reasons and the ways that quantitative studies cannot provide in HIS implementation and evaluation studies (Ash and Berg, 2003). Although qualitative methods seem to be more appropriate for information systems evaluation, quantitative measurement standards should be supported by qualitative data in order for their meaning to be comprehensible (Berg, 1999). Therefore, measuring the quantitative aspect of these systems' improvements is difficult (Chin and McClure, 1995).

Qualitative methods take into account experiences, emotions and human-interaction processes and are appropriate for use during the formative stage of system development (Currie, 2005). Qualitative evaluation is also useful in cases with socio-cultural or political implications and in cases where changes in tasks, roles and responsibilities have emerged (Berg, 1999).

Qualitative methodologies are now being seen as methods that might generate a closer approximation of validity. Qualitative processes bring researchers closer to the truth of a domain via the subsequent rich and detailed analysis of the human experience (Currie, 2005).

The objectivist approach is defined as the method "that employs quantitative measurement and emphasises experiments" (Friedman and Abbas, 2003). According to Moehr (2002), the objectivist approach is described through the following proceedings:
- Study question definition,
- Definition of the system under investigation,
- Selection of measurement methods and instruments,
- Demonstration study design,
- Demonstration study management,
- Results analysis.

However, the objectivist approach has serious practical and scientific limitations for the evaluation of information systems in health care.

The subjectivist approach concerns "the scientific methods that employ qualitative observation of phenomena as they occur naturally" (Friedman and Abbas, 2003). Moehr (2002) in his study demonstrated that the subjectivist approach concentrates on:
- users' requirements and queries,
- users' perceptions of the effects of the information system and of the environment in which it will be applied,
- careful, detailed and sensitive observation,
- inductive reasoning for understanding the circumstances and conditions of application, and
- addressing the problem from a different set of premises.

Therefore, according to Moehr (2002), the ideal evaluation methodology in health informatics makes the subjectivist approach quantifiable and gives the objectivist approach more realistic requirements, with its drawbacks compensated by the subjectivist method.

1.5.4.3. Randomised Controlled Trials

Evaluation in medical informatics combines the disciplines of medicine, informatics and technology (Talmon et al., 1999) and has a tendency to follow the model of controlled clinical trials, which uses a number of assumptions with unexamined implications (Forsythe and Buchanan, 1991). Despite this tendency, randomised controlled trials (RCTs) are not suitable for evaluation when multiple groups participate in the evaluation process (Kaplan, 2001).


The randomised controlled trial (RCT) is the current gold standard for informatics and biomedical evaluation, as it preserves rigid objectivity and controls irrelevant effects (Currie, 2005). However, this method is not appropriate for answering "when" and "how" the systems will be used (Kaplan, 2001). The RCT, as an evaluation method, cannot justify the resources required for the introduction of an information system (Moehr, 2002) and is unable to capture information required for effective system development during the formative development process (Currie, 2005). Instead, the RCT is a suitable intervention that can and should be used for summative evaluation (Currie, 2005).

On the other hand, the RCT is the only reliable method for estimating the size of small but valuable benefits of any kind of intervention, such as testing new drugs, surgery and other procedures, as well as for estimating the frequency and severity of their side effects (Moehr, 2002; Wyatt and Wyatt, 2003).

1.5.5. Models Referred for Evaluation

1.5.5.1. The Socio-Technical Model

The socio-technical model of evaluation, as a user-oriented approach, holds that systematic understanding of and insight into the daily work practices in which the information systems will be applied should precede design and implementation (Berg, 1999). The purpose of such a model is to guarantee quality assurance in medical practice (Harteloh, 2003).

A socio-technical appraisal is possible in the case of different outcomes from evaluating information sharing as a social process, in which technical and social aspects are strongly interconnected (Aarts and Berg, 2004). Although this model is complex (Marx and Slonim, 2003), a socio-technical requirement analysis assists the system's developers in shaping a detailed description of the environment surrounding the computer system, emphasising the awareness of and coordination among users in their workplace (Reddy et al., 2003). It is therefore worth mentioning that research is mandatory in order to show the importance of socio-technical issues, such as leadership, clinician motivation and communication (Ash and Berg, 2003).

1.5.5.2. The Technology Acceptance Model

The Technology Acceptance Model (TAM), and subsequently the Information Technology Acceptance Model (ITAM), was identified by Davis (1989) and is based on the perceived usefulness and perceived ease of use factors of user technology acceptance. This model focuses on the user as an individual (Dixon, 1999), predicting and examining the factors that will lead users to either accept or reject an information system (Despont-Gros, Mueller and Lovis, 2005). To achieve this target, the model demonstrates the aspects (constructs) to which evaluation can be applied (Dixon, 1999), measuring the information system's usefulness and ease of use (Despont-Gros, Mueller and Lovis, 2005).

Analysis at the individual level has been the prevailing evaluation perspective (Southon, Sauer, and Crant, 1997). Yang and Yoo (2004), based on Davis's TAM, found in their study that cognitive attitude is an important variable in explaining behaviours regarding information system usage. On the other hand, Legris, Ingham and Collerette (2003) argued in their study that although TAM is a useful model, it should be integrated into a broader one that includes human and social change aspects.

1.5.5.3. The Task Technology Fit Model

The Task Technology Fit (TTF) model focuses on performance (organisational and individual) and is a theoretical perspective of interest for user evaluation. TTF rests on the hypothesis that a better fit between users' needs and technology leads to better performance (organisational and individual), and that users themselves can evaluate the level of this fit (Despont-Gros, Mueller and Lovis, 2005).


1.5.5.4. Disconfirmation Theory and Dissonance Theory

Both these theories are based on the premise that satisfaction reflects the gap between the performance of an information system and the expectations about it (Despont-Gros, Mueller and Lovis, 2005).

The dissonance model holds that the unfulfilled expectations users have of an information system create dissonant ideas that have to be minimised in order for users to maintain consistency between the satisfaction they adopt and their prior expectations (Despont-Gros, Mueller and Lovis, 2005). Additionally, Liberman and Förster (2005) showed in their study that dissonance reduces post-decisional spreading of alternatives in cases of repeated decision difficulty.

On the other hand, "the disconfirmation theory predicts satisfaction by expectations perceived by individuals, perceived performance and perceived disconfirmation; unrealistically high expectations result in lower levels of perceived benefit than those associated with realistic expectations" (Staples, Wong and Seddon, 2002). The expectancy disconfirmation theory (EDT) has been successfully used to predict users' intention to continue using information technologies. Chiu et al. (2005) proposed an EDT model to examine users' cognitive beliefs.

1.5.5.5. Despont-Gros, Mueller and Lovis Model for User Evaluation

Despont-Gros, Mueller and Lovis (2005) proposed a model concerning the interactions between the user and the clinical information system (CIS), on a human-computer interaction (HCI) basis. This model was refined from existing models and studies and is focused on user evaluation. Furthermore, it shows the complexity of the following aspects:

- Information system characteristics, including input-output devices, dialogue techniques, computer graphics and architecture.
- Human/user characteristics, such as attitude towards innovation, level of use, amount of use and demographic data.
- Context of use and environment characteristics. For the adoption of a new CIS, the context of its use, communication patterns and the manipulation of other existing tools have to be taken into account.
- Development process characteristics, involving the characteristics of design, implementation and evaluation, during which user participation and involvement are really important.
- Impact or outcome of computerisation.

All these aspects contribute to user acceptance and reflect users' perceptions (Despont-Gros, Mueller and Lovis, 2005).

1.5.5.6. Validation of Telematic Applications in Medicine Guidelines

The Validation of Telematic Applications in Medicine (VATAM) guidelines are a synthesis of existing methods and methodologies for assessing telematics applications in medicine. Their main purpose is to provide all potential users with the proper approach to the assessment of information and communication technology in health care, focusing mainly on the questions, in the early stages of evaluation, that need to be answered before the assessment study is designed (Talmon et al., 1999). Validation guidelines represented in a usable, easy-to-access and informative way are beneficial for all stakeholders in health telematics projects (Nykänen et al., 1999).

1.5.6. Problems in Evaluation of Information Technology in Health Care

1.5.6.1. Insufficient Evaluation Methods, Guidelines and Tools

One of the main barriers to evaluation is the absence of sufficient evaluation methods, guidelines or tools combining the technical, organisational and social issues regarding health information systems (Ammenwerth et al., 2004). Resources or other means to undertake evaluation studies are difficult to find (Rigby, 2001), as there is a lack of resources for post-project assessment, and studies that are done may not be published (Friedman and Haug, 2003).


1.5.6.2. Complexity of Evaluation Object

According to Ammenwerth et al. (2003a), the evaluation object of health information systems is complex and broad, due to the fact that it concentrates not only on hardware and software, but also on information processing. Therefore, evaluation necessitates an understanding of computer technology combined with the social and behavioural processes. Even when the introduction period has passed, the evaluation object may be steadily changing (a moving evaluation target) (Ammenwerth et al., 2003a).

Friedman and Haug (2003) have additionally stated the following barriers as responsible for the complexity of a system's evaluation:
- The diversity of the population involved in the system's evaluation (stakeholders, developers, evaluators),
- Multiple factors which influence effects and are difficult to generalise,
- The changing environment, which confounds any evaluation: both the system being evaluated and outside factors are changing the way users do their work,
- The prolonged period of evaluation until completion, to such a degree that the project has changed by the time the evaluation is complete,
- Due to the continuous changes in the field of informatics, the concern that as soon as the instruments for the evaluation are found, they may no longer be usable.

Furthermore, the effects of an information system may vary across departments due to the different factors emerging in these departments, influencing the results of the evaluation study (Ammenwerth et al., 2003a).

1.5.6.3. Conflicting Evaluation Questions

On the other hand, information technology evaluation becomes more complicated due to insufficient cooperation among researchers from different academic fields and traditions and among the different professional groups within the hospital, and due to the reliance on aspects such as legislation or economic limitations (Ammenwerth et al., 2003a). Therefore, the evaluation questions may become conflicting, regarding economic, sociological, psychological, organisational, technical, information-logistical or clinical aspects, leading to different and quite complex study designs and evaluation methods that are difficult to manage with limited resources in a given period of time (Ammenwerth et al., 2003a). For that reason, the creation of clearly defined norms is quite intangible and, consequently, indirect measures, such as user satisfaction, are often applied, resulting in an incomplete definition of information technology's benefits (Ammenwerth et al., 2003a). In many cases, information systems are not sufficiently integrated, as in the case of Kenya's district health systems, in which information systems were incoherent, with no effective central co-ordination to ensure availability in the information flow (Odhiambo-Otieno, 2005).

1.5.6.4. Funding and Number of Participants

Furthermore, for an evaluation study to be accomplished, two things are mandatory: sufficient funding and a sufficient number of participants. An insufficient number of participants complicates the extraction of results from quantitative measures (Zielstorff et al., 1997).

It is therefore worth mentioning that, without stakeholders' support and motivation, sufficient resources and participants for IT evaluation studies are difficult to find (Ammenwerth et al., 2003a). According to Friedman and Haug (2003), in many cases the organisational management does not favour evaluation studies, fearing:

– the possibility that the project will not be beneficial for the hospital,

– that the budget may be insufficient for further activities, in the case of a beneficial project, and

– the identification of new needs by the evaluation study.

A clinical information system, and likewise a hospital information system, may cause harm as well as benefit (Friedman and Haug, 2003). In the case of an unexpected adverse effect, however, the underlying factors should be assessed and corrected rather than continuing to fund the project in the hope of results. If fundamental underlying factors are not corrected and deeper analysis is not judged necessary, the project will still fail, but at additional cost (Littlejohns et al., 2003).

It is hard to allocate IT costs accurately, and there is a lack of methodology for financial evaluation (Friedman and Haug, 2003). The high cost and the high level of risk of information systems themselves are factors that should be taken seriously into account, as large-scale information system implementations have a 30% failure rate (Southon, Sauer, and Crant, 1997).

1.5.6.5. Conflicts

However, there are conflicts in understanding between those who commission the system, the developers and the users, which should be adequately appreciated (Littlejohns et al., 2003). HIS stakeholders on the one side and developers on the other should manage their perceptions and expectations of information systems development, because the possibility of system failure increases if expectations are unrealistic and out of step with the existing health care environment (Heeks, 2005b). Entrepreneurs should be capable of taking a detached view of the cost effectiveness of the intervention (Littlejohns et al., 2003). In many cases, users' information requirements are not taken seriously into account, resulting in systems that are irrelevant to their potential users (Odhiambo-Otieno, 2005). Systems that do not take social and professional cultures into account and that underestimate the complexity of routine clinical and managerial processes are therefore prone to failure (Littlejohns et al., 2003).

1.5.6.6. Organisational Resistance

In many cases, organisational management perceives an evaluation study as a secondary priority, or the contribution of evaluation is not valued, and it therefore prefers to fund other, more “imperative” studies. In other cases, there is an inability to detect and acknowledge the failures or mistakes of the system, or managers do not want to see their decisions evaluated (Ammenwerth et al., 2004). Nevertheless, the complexities of organisational factors should be approached in a more sophisticated way (Southon, Sauer, and Crant, 1997).

1.5.6.7. Training- Education of Health Care Professionals

When an information system is introduced, users need a lot of time to become familiar with it and adopt it fully (Ammenwerth et al., 2003a). Because of the tremendous development of health information systems, educational courses and even programmes are needed so that health care professionals are informed, well educated and able to support these systems in their daily practice (Heeks, 2005a). On the other hand, Feied et al. (2004) argue that a prospective information system should preferably be usable for basic clinical functions with little or no formal training. An information system that demands intensive training before it can be used may lead to productivity problems, which are an indication of a poorly designed system (Feied et al., 2004). However, given the continuous implementation of new technology, on-the-job technological training of the staff should be provided (Li and Benton, 2005). Staff should be trained in techniques for information production and use (Odhiambo-Otieno, 2005) in the introduction phase. Cases have been reported in which educational efforts that took place late in the implementation phase contributed to the system's failure (Littlejohns et al., 2003).

1.5.6.8. Methodological Issues

One of the main methodological barriers is the lack of access to easy methodologies (Friedman and Haug, 2003). Evaluation studies are often not sufficiently based on theory, are inadequately implemented (Ammenwerth et al., 2004), or do not use existing methods from relevant fields such as psychology (Friedman and Haug, 2003). In addition, evaluators are often not sufficiently trained and are therefore unable to select the appropriate methods for the specific case they are asked to address (Ammenwerth et al., 2004) or to identify the target of the evaluation (Friedman and Haug, 2003).


1.5.7. Recommendations

1.5.7.1. Insufficient Evaluation Methods, Guidelines and Tools:

Recommendations

Guidelines for evaluation should be widely available in order to strengthen future evaluation studies and their perceived necessity, not only within the medical informatics community but also among other individuals dealing with health information systems and health care delivery (Ammenwerth et al., 2004). The availability of case studies in which evaluation has actually influenced decisions (both positive and negative studies) (Friedman and Haug, 2003), via evaluation centres, via established networks supporting the exchange of experience, or via broadly accessible warehouses (Ammenwerth et al., 2004), would be enormously beneficial.

1.5.7.2. Complexity of Evaluation Object: Recommendations

The evaluation object should focus on specific aspects of the system, and the study questions should be defined after thorough discussion and agreement on the evaluation targets and criteria (Ammenwerth et al., 2003a). Given the complexity of the evaluation object, creating a group of evaluators who are cross-trained in informatics and evaluation, and who are therefore able to perceive at least the intrinsic complexities of evaluation, would be advantageous (Friedman and Haug, 2003).

1.5.7.3. Conflicting Evaluation Questions: Recommendations

New evaluation questions may be introduced during the study, but only if they do not cause conflicts (Ammenwerth et al., 2003a).

1.5.7.4. Number of Participants and Funding: Recommendations

Regarding an adequate number of participants in the evaluation study, the responsible management should be motivated and prospective participants should be methodically guided, providing the opportunity for development (Ammenwerth et al., 2003a). For clinical trials, multi-centre studies may be considered because of the large number of available participants, but with caution regarding the more difficult study design and the variation between study participants across the participating centres (Ammenwerth et al., 2003a).

Evaluation should be sufficiently funded during the planning, development, implementation and operation of the HIS (Ammenwerth et al., 2004). Friedman and Haug (2003) suggest that a fixed percentage (approximately 10%) of the total IT development budget should be pre-allocated for evaluation.

1.5.7.5. Conflicts: Recommendations

Evaluation studies should be performed by independent professional experts, free of any conflict of interest and unbiased by political, managerial or any other kind of pressure, in order to answer the evaluation questions that have been set (Ammenwerth et al., 2004).

1.5.7.6. Training- Education of health care professionals:

Recommendations

In many cases, system failure results from managers' and developers' failure to look for and learn lessons from past projects (Littlejohns et al., 2003). System evaluation should be based on a variety of approaches and methods (Ammenwerth et al., 2004), presenting an overall view (Littlejohns et al., 2003), and should aim to provide a comprehensive and accurate picture of the health situation in the hospital environment. To accomplish this, developers and managers should collect information from other health care providers with a similar operational infrastructure (Odhiambo-Otieno, 2005) or should be aware of the systems that competing organisations have adopted (Grover, Jeong and Segars, 1996). Additionally, every new manager should be trained to take up the results of evaluation surveys and to devise strategies in accordance with the ratings of user requirements and user satisfaction (Li, 1997).


1.5.7.7. Methodological Issues: Recommendations

Firstly, the information technology, the environment in which it will be applied and any aspect that may influence its implementation should be described in detail (Ammenwerth et al., 2003a). Additionally, all the changes that emerge during the evaluation, and their interaction with users, should be documented thoroughly (Ammenwerth et al., 2003a).

It is also worth pointing out that 'evidence-based informatics' helps put science into the field (Friedman and Haug, 2003). The evaluation should follow a long-term plan so that users can integrate the new information technology, taking into account the learning period during the introduction stage and the changes that may occur in the moving evaluation target (Ammenwerth et al., 2003a). Attention should therefore be paid to keeping the project teams overseeing extended programmes in post for the whole period, otherwise projects will be set back (Littlejohns et al., 2003). On the other hand, the development of new, flexible evaluation methods that use qualitative techniques and allow studies to be done quickly would contribute to effective time saving (Friedman and Haug, 2003).

To overcome the methodological barriers in evaluating health care information systems, Friedman and Haug (2003) proposed the creation of an evaluation portal that includes:

– 'packaged' approaches to evaluation, with relevant text materials explaining how to use the tools,

– evaluation instruments that have been validated and used in other studies, and

– completed reports and case studies (positive and negative).


1.5.8. Information System Effectiveness and Success

The main criteria for measuring information system effectiveness include system usage, user information satisfaction, quality of decision-making, productivity from cost-benefit analysis and system quality. Of these criteria, system usage and user satisfaction are the most prominent measures (Southon, Sauer, and Crant, 1997). To measure the effectiveness of a medical information system, the processes that reflect its effects should be articulated so that they can be analysed during the evaluation research (Kaplan, 1997). On the other hand, for a system to be successfully implemented, developers should anticipate whether users will use the system or not, based on past usage and on the external and internal factors influencing the usage of the information system (Bajaj and Nidumolu, 1998).

Delone and McLean's (1992) classification of information system effectiveness and success consists of six elements (system use, user satisfaction, system quality, information quality, individual impact and organisational impact). The model also holds that system quality and information quality together lead to system use and user satisfaction, and that use and satisfaction in turn produce an individual impact, which further leads to an organisational impact (Despont-Gros, Mueller and Lovis, 2005).

Based on the above, Southon, Sauer, and Crant (1997) have further developed a model for the measurement of information systems' effectiveness, consisting of the following aspects:

– Performance assessment (evaluation referent),

– Unit of analysis at the organisational as well at the individual level,

– Evaluation type (process, response and impact).

User involvement is a determinant of system success (Igbaria and Guimaraes, 1994). From this point of view, the role of users must be carefully considered, and more cost-efficient practices are needed for gathering users' implicit needs and requirements (Kujala, 2003). However, user involvement and influence in the IT development of large organisations may be impractical (Gefen and Ridings, 2003).

The most recent HIS Working Group (Heidelberg, Germany, April 2002)

underlined that people ultimately determine the success of a HIS, with a strong

emphasis on the sociological, behavioural, and ethical aspects of the HIS (Giuse

and Kuhn, 2003).

1.5.9. User Acceptance

As Collins' dictionary defines (p.8), “acceptance of something that you have been offered is the act of agreeing to use it”. On the other hand, Despont-Gros, Mueller and Lovis (2005) define information system acceptance “as an attitude of users towards an information system or an information technology. It is a multifactor construction based on an affective and cognitive evaluation of all components surrounding and influencing the interaction process between a user and an information system”.

User acceptance is characterised as an important part of the development and evaluation of information systems (Davis, 1993). With the increasing adoption of information technology in hospitals, physicians and nurses demand that access to and management of information become easier through these systems (Beuscart-Zephir et al, 1997). The boundary between users and IT personnel (Gefen and Ridings, 2003) and resistance by managers (Davis, Bagozzi and Warshaw, 1989) often reduce the likelihood that the installed IT will be accepted by users.

User acceptance also seems to assist in the identification of a system's problems. The reasons that users accept or reject systems should therefore be defined, in order to predict, explain and increase user acceptance (Davis, Bagozzi, and Warshaw, 1989). According to Doll and Torkzadeh (1988), it is more useful to measure the frequency and level of usage of the several functions (width of use) than the number of people who use the system. Travers and Downs (2000) reported that the differences between user acceptance and user rejection concern organisational cultures, users' relationship with practices and post-implementation experiences, as well as whether or not benefits emerge from system usage.

Many researchers present user acceptance as a reflection of the harmonisation of user and developer characteristics in the system's implementation (Despont-Gros, Mueller and Lovis, 2005). User acceptance plays an important role in the successful adoption of the information system (Despont-Gros, Mueller and Lovis, 2005).

User acceptance studies should preferably apply a pre-test and post-test design, so that the data collected before and after the implementation of the system can be compared and confirmed (Aydin, 1994). Ammenwerth et al (2003b) appraised the introduction of a computer-based nursing documentation system in a pretest–posttest intervention study, concentrating on a questionnaire built from items of published questionnaires together with items developed for the purpose of their study.

Acceptance of technological tools seems to be higher among younger groups of health professionals. Mikulich et al. (2001) found that seventy-five percent of the physicians who used the examined information system had graduated from medical school after 1990, emphasising user acceptance among the younger group of doctors. According to the study of Brumini et al. (2005), younger users with computer science education and previous computer experience were more positive towards computers than others.

Users' opinion of the implemented information system seems to play a critical role. Relevance, validity and work are the three important parameters describing the way users experience the system (Karlsson et al, 1997). Gadd et al. (1998) used a multi-method formative evaluation to assess clinicians' views of a system's usability and usefulness. Ahearn and Kerr (2003) used focus groups to examine the disadvantages and advantages of using prescribing decision support systems, as well as ways to improve these systems in the future.

Health care professionals do not feel that information technology reduces their initiative in decision-making, as Gardner and Lundsgaarde (1994) reported in their research on physicians' and nurses' attitudes using the Health Evaluation through Logical Processing (HELP) clinical information system.

In addition to these key determinants, users' perceptions of stakeholders' support and of the level of organisational support for the system implementation are also reported as determinants of user acceptance (Venkatesh et al, 2003).

1.5.10. User Satisfaction

According to Collins dictionary’s definition (p.1286), “satisfaction is the

pleasure that you feel when you are doing or have done something that you

wanted or needed to do”.

User satisfaction is one evaluation mechanism for determining system success (Wu and Wang, 2005), or may act as a subjective measure of an information system's success (Despont-Gros, Mueller and Lovis, 2005). Additionally, as mentioned before, user satisfaction is also used as a surrogate for success (Li, 1997; Despont-Gros, Mueller and Lovis, 2005).

According to the results of Doll and Torkzadeh's survey (1988), the five components of end-user satisfaction measures are content, accuracy, format, ease of use, and timeliness. In Zielstorff's et al. (1997) project, although preliminary results showed no effects on knowledge or clinical decision-making skills, the system was rated positively for user satisfaction. Bailey (1990) described a technique that included extensive empirical tests and comparison standards for hospital computer user satisfaction, in order to measure and manage user attitudes towards aspects of computer systems and thereby promote the effectiveness of those systems. Dupuits and Hasman (1995), based on Bailey's approach, found that user satisfaction is closely correlated with the way the software works and facilitates its users.

User satisfaction and user attitude are closely related, as the latter is the affective evaluation of the system by its user (Barki and Hartwick, 1994). Bindels et al. (2003) investigated general practitioners' attitudes based on their experiences, underlining the attention that has to be paid to the promotion and adoption of accepted national and regional guidelines, and to fostering an optimistic attitude towards practice guidelines among users in daily practice. It is also worth mentioning that user satisfaction fluctuates among users in different departments, owing to the differences in working processes in these departments (Ammenwerth et al., 2003c).

1.5.10.1. Variables Affecting User Satisfaction

According to Bailey and Pearson’s (1983) information system success

instrument, user satisfaction is affected by:

– Top management’s interest, support and participation,

– User’s confidence and certainty about the systems provided,

– User’s participation toward the functioning of the computer-

based information systems and services.

Li (1997) added to Bailey and Pearson's variables the support of productivity tools, emphasising the quality and quantity of the computer software, hardware and peripheral devices that support the organisation's functions as an important factor for user satisfaction.

In studies assessing attitudes, physicians agree that care combined with information technology is better than standard care (Mikulich et al., 2001), and they are willing to adopt an innovative information system only when the effort of implementation adds value by improving their productivity and performance (Vlahos and Ferratt, 1995) and the efficacy of their workflows, without adverse effects on patient care (Gadd and Penrod, 2001).

User satisfaction is affected by perceived benefit and expectation characteristics such as perceived usefulness, ease of use and user expectations. Ease of use is important, but usefulness is much more important (Davis et al., 1989). Davis, Bagozzi, and Warshaw (1989) reported in their study that perceived usefulness strongly influences people's intentions, whereas perceived ease of use has only a slight effect on them.


Expectations of a system should be kept at a realistic level. Users with unrealistic expectations become dissatisfied when they get an accurate picture of the information system and may stop using it (Szajna and Scamell, 1993).

User satisfaction is affected by user background, experience, skills and involvement (Mahmood et al., 2000), as the way users deal with computer technology is a core key to the success or failure of the whole system (Grémy, 2005). At this point, user participation and user involvement have to be distinguished. User involvement concerns the importance and personal relevance of a system to its user (Barki and Hartwick, 1994) and the perception that the user should be included in the development process (Despont-Gros, Mueller and Lovis, 2005). User participation concerns the activities performed by the user during the development process (Barki and Hartwick, 1994; Despont-Gros, Mueller and Lovis, 2005).

User involvement positively affects user satisfaction (Kujala, 2003). Moreover, user participation and user satisfaction during system development are significantly correlated (McKeen and Guimaraes, 1997), owing to a sense of contribution and oversight, the taking of initiatives towards the system, and a better understanding of the system's capabilities (Baroudi, Olson and Ives, 1986).

1.5.11. Questionnaires as Measure for User Satisfaction

The questionnaire seems to be a valid and reliable measure for user

satisfaction (Baroudi and Orlikowski, 1988; Ammenwerth et al., 2003c). To

appreciate the usability of an information system it is important to measure user

satisfaction in addition to its effectiveness and efficiency. There are modular

questionnaires for user satisfaction, such as the Questionnaire for User

Interaction Satisfaction (QUIS), the Software Usability Measurement Inventory

(SUMI) and the System Usability Scale (SUS) (Table 6).


Questionnaires measuring user satisfaction

SUS (System Usability Scale) (1)
Factors measured: user satisfaction with software; subjective assessments of usability.
Comments: 10-item scale questionnaire.

SUMI (Software Usability Measurement Inventory)
Factors measured: likeability, efficiency, helpfulness, control and learnability.
Comments: 50-item questionnaire; scores the measured factors against expected industry standards.

QUIS (Questionnaire for User Interaction Satisfaction) (2)
Factors measured: screen factors; terminology and system feedback; learning factors; system capabilities; technical manuals; on-line tutorials; multimedia; voice recognition; virtual environments; Internet access; software installation.
Comments: assessment of the user's subjective satisfaction combined with attitudes towards human/interface factors; similar to SUMI; includes a demographic questionnaire; measures overall user satisfaction along six scales.

Table 6: Questionnaires for user satisfaction measurement. Sources: (1) Brooke, 1996 (2) Harper and Norman, (3)


2. MATERIALS AND METHODS

2.1. MATERIALS

2.1.1. Kavala Hospital

The peripheral hospital of Kavala is located in the city of Kavala. The departments of the hospital are the following:

Health Care Departments: A' Pathological, B' Pathological, A' Pulmonary, B' Pathological, Cardiological, Neurological, Pediatric, Reumatological, Dermatological, A' Surgical, B' Surgical, Department of Thoracic Surgery, Orthopaedic, Urological, Opthalmological, Anesthesiological, Midwifery and Obstetrics, Neurosurgical, Microbiological, Haematological, Radiological, Pharmaceutics, Biochemistry, Psychiatric.

Administrative Departments: Personnel Department, Secretariat, Admission Office, Financial Management, Department of Computer Science, Nutritional, Warehouse Department.

The hospital information system is in use in all the above departments except the Psychiatric department. However, the system is at a very early stage of development. In the health care departments it is used only for drug ordering; for that reason, the only users in each health care department are the head nurse and the alternate head nurse. In the administrative departments, all operations are carried out through the hospital information system. Even so, there are many non-users in these departments, mainly because they are older employees with no IT education.

For the purpose of this study, 241 persons working in the hospital were asked whether or not they used the HIS. Sixty-three of them were employed in the administrative departments of the hospital and the remaining 178 were health professionals. The percentage of users who declined to take part in the study was very small (7.8%).


In total, sixty-four users participated in the study: forty-three of them were administrative employees (administrative group) and twenty-one were health professionals (health care group).

Regarding the percentages of non-users, 27% of the administrative employees and 88.1% of the health care professionals were non-users of the HIS.

2.1.2. System Usability Scale

In this study, the System Usability Scale (SUS) questionnaire was used. This questionnaire is designed only for users of the system. Subjects were asked to complete a paper-based form of the SUS, handed to them in person, translated into Greek and attached to the consent form.

The SUS was designed by Brooke (1996) to measure the usability of a system. It is a simple, ten-item scale giving a high-level subjective assessment of a variety of aspects of system usability, and it is freely available for usability assessment in a variety of research projects and industrial evaluations (Brooke, 1996).

One of the main reasons the SUS questionnaire was selected is that it is simple and flexible and covers many aspects of system usability relevant to the evaluation of user satisfaction. It is also worth mentioning that this questionnaire had not previously been used for the evaluation of hospital information systems.

2.1.2.1. Using the System Usability Scale

According to its creator, the following points should be taken into account when using the SUS questionnaire:

– Respondents should not think for a long time about what to check; the immediate response is valuable.

– All items should be checked.

– If respondents are not sure how to respond to a particular item, they should mark the centre point of the scale.


2.1.2.2. Scoring the System Usability Scale

According to Brooke (1996), the calculation of the SUS score is

demonstrated as follows:

Each item's score contribution will range from 0 to 4.

For items 1,3,5,7,and 9 the score contribution is the scale position minus

1. For items 2,4,6,8 and 10, the contribution is 5 minus the scale

position.

The score contributions from each item are summed.

The above sum is multiplied by 2.5. The result from this multiplication is

the overall value of SUS.

SUS scores have a range of 0 to 100.

Scores of each item separately are pointless on their own.
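To make the scoring rule above concrete, the following is a minimal illustrative sketch in Python (the original study used SPSS, so the function name and input format here are assumptions, not part of the study):

```python
def sus_score(responses):
    """Compute the SUS score from ten item responses on a 1-5 scale.

    Odd-numbered items (1, 3, 5, 7, 9) contribute (response - 1);
    even-numbered items (2, 4, 6, 8, 10) contribute (5 - response).
    The summed contributions are multiplied by 2.5, giving 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5


# Example: a fairly positive (hypothetical) respondent
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 3]))  # 77.5
```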

2.1.3. Consent Form and Information Sheet

In accordance with the Declaration of Helsinki, an information sheet and a consent form were given to all subjects. Both were written in Greek for the convenience of the respondents. The participants were first informed about the purpose of the study, both orally and by reading the information sheet. Participation was limited to what was specified in the information sheet. All the subjects agreed, by signing the consent form, to take part in completing the questionnaire.

As stated in the consent form, the subjects were free:

– to ask questions concerning the study and the information sheet provided, and

– to withdraw at any time, without giving any reason and without their legal rights being affected, as their participation was voluntary.

Additionally, through the consent form the subjects had the opportunity to agree or refuse to take part in the study and to allow any responsible person involved in this research to look at any of the information the subject provided.


2.2. METHODS

2.2.1. Data Protection

All the data collected for the study were kept locked in a safe place. The records may be transferred only within E.U. countries. The results may be published in a Greek or international journal or conference, but without compromising the anonymity of the subjects. Anonymity was secured by using index numbers.

2.2.2. Selection Criteria

All the persons approached were asked whether or not they used the hospital information system. Initially, two hundred and forty-one subjects were kindly asked to participate in the study. One hundred and ninety-eight subjects were excluded from completing the questionnaire, as they were non-users. Therefore, only users of the system, whether health professionals or administrative staff, filled in the questionnaire.

2.2.3. Statistical Methods

The person responsible for the collection and the statistical evaluation of

the data was Chiotaki Nikomacheia. The statistical package used was SPSS

12.0. The statistical methods used were descriptive statistics, bivariate

correlations and non-parametric tests.
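Purely as an illustration of how analyses of this kind (descriptive statistics and a non-parametric chi-square test on response frequencies) could be reproduced outside SPSS, a minimal Python sketch is given below; the file name and column names are assumptions, not the study's actual variables:

```python
import pandas as pd
from scipy import stats

# Hypothetical data set: one row per respondent, one column per SUS item
# rating (1-5) plus group membership (file and column names are assumed).
df = pd.read_csv("sus_responses.csv")

# Descriptive statistics (count, mean, SD, quartiles) for all numeric columns.
print(df.describe())

# Non-parametric chi-square goodness-of-fit test on the observed
# frequencies of the response categories for one item.
observed = df["use_frequency"].value_counts().sort_index()
chi2, p = stats.chisquare(observed)
print(f"chi-square = {chi2:.2f}, p = {p:.4f}")
```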


3. DATA ANALYSIS

3.1. SUS Score

The SUS questionnaire data suggest that the subjects rated the system as fairly usable, with a mean SUS score of 67.305 and a standard deviation of 13.46.

In the health care group, the SUS score most commonly fell between 61-80%, for 66.7% of its individuals. Additionally, 14.3% of the health care users scored between 81 and 100%, and an equal percentage of the same group scored between 41 and 60%. Finally, a small percentage (4.8%) scored between 21 and 40%, and none of them scored between 0 and 20%.

In the administrative group, the majority of individuals (53.5%) also scored between 61-80%. The percentage that scored 81-100% was 11.6%, while many respondents (30.2%) scored between 41-60%. Finally, none of the respondents scored 0-20%, and 4.7% scored 21-40%.


Figure 1a: SUS score in the health care group

Figure 1b: SUS score in the administrative group


3.2. Frequency of Use

The statistical evaluation of the data revealed that 66.7% of the health professionals (n=21) would strongly prefer to use the system frequently (Fig. 2a). These results are significant (p<0.01). It is also notable that, in total, 80% of the health care users would prefer to use the system frequently. The latter result is also significant (p<0.01) using the Chi-square statistical test, but imprecise because of the wide 95% confidence interval [95% CI=(0.642, 0.978)].

Similar results were observed for the administrative users (n=43): over half of them would strongly like to use the system regularly, and 70% of the administrative group in total tended towards preferring frequent use (Fig. 2b). These results were also significant (p<0.01) according to the Chi-square statistical test, but imprecise due to the wide 95% confidence interval.
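The wide intervals reported here and in the following sections appear consistent with a simple normal-approximation (Wald) confidence interval for a proportion, which is inevitably wide for samples as small as n=21. A minimal sketch follows; the count of 17 out of 21 is inferred from the reported ~80% and the quoted interval, so treat it as an assumption:

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    p_hat = successes / n
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - half_width, p_hat + half_width


# About 80% of the 21 health care users preferring frequent use:
low, high = wald_ci(17, 21)
print(f"({low:.3f}, {high:.3f})")  # approximately (0.642, 0.978)
```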


Fig. 2a: Frequency of use in health care group

Fig. 2b: Frequency of use in the administrative group


3.3. Complexity

Concerning the complexity of the system, 42.9% of the health care professionals totally disagreed, and an additional 23.8% disagreed, that the system was complex. A small part of this group (14.3%) characterised the system as complex, while a considerable percentage (19%) had no specific opinion (Fig. 3a). These results are not significant when test statistics are performed.

Likewise, approximately half of the administrative employees totally disagreed, and an additional 16.3% disagreed, that the system was complex. Only 10% of the administrative users considered the system complex. It is also worth mentioning that approximately 30% of them found it neither complex nor simple (Fig. 3b). These results are significant (p<0.05), but the confidence interval is very wide.

Nevertheless, it is interesting that, in total, 66.7% of the health care professionals and 62.8% of the administrative group disagreed that the system is complex.


Fig. 3a: Complexity in health care group

Fig. 3b: Complexity in the administrative group


3.4. Easy to Use

In the health care group, 38.1% of the participants stated that the system was not as easy to use as they had thought, whereas 29.4% of the same group found the system easy to use. It is worth mentioning that a large percentage of the health care users (33.3%) found the system neither easy nor difficult to use (Fig. 4a). According to the Chi-square test, these results have no statistical significance.

Similarly, approximately 30% of the administrative users thought that the system was easy to use, whereas 41.9% disagreed with this statement. Around 30% of this group neither agreed nor disagreed that the system was easy to use (Fig. 4b). As in the health care group, these results are not statistically significant.


Fig. 4a: Easy to use in health care group

Fig. 4b: Easy to use in the administrative group


3.5. Need for Support from Technician

About one in two of the users in the health care group did not believe that support from a technician is necessary in order to feel confident using the system. The single most common response in this group, however, was neutral, with 38.1% having no specific opinion on this statement. The remaining 14.3% considered support and guidance from a technician important (Fig. 5a). According to the Chi-square statistical test, these results are insignificant.

The largest proportion of the administrative employees (37.2%) strongly supported the opinion that a technician's support is unnecessary, whereas over 30% of the administrative users stated that support from a technician is essential. It is also notable that, in total, over half of the respondents in the administrative group judged support from a technician unnecessary (Fig. 5b). However, these results are statistically insignificant.


Fig. 5a: Need for support from technician in health care group

Fig. 5b: Need for support from technician in the administrative group


3.6. Functional Integrity

Over half of the respondents in the health care group (57.1%) strongly agreed that the system was well integrated, and a further 28.6% simply agreed with this statement. The opposite opinion was reported by only 4.8% of the group (Fig. 6a). These results are statistically significant (p<0.05) but imprecise. Although the total proportion of the group that considered the system well integrated was more than 85%, also leading to significant results (p<0.01), the confidence interval is fairly wide [95% CI=(0.707, 1.007)], suggesting poor precision.

The largest part of the administrative group (32.6%) strongly agreed that the system was well integrated, and over a quarter of the same group (27.9%) simply agreed. An equal proportion of the administrative users neither agreed nor disagreed about the good integration of the system, while the opinion that the system was insufficiently integrated was held by 11.6% of the group (Fig. 6b). These results are also significant (p<0.05) but imprecise. Nonetheless, it is interesting that, in total, over 60% of the respondents in this group agreed that the system was well integrated. The latter result is significant (p<0.01) using the Chi-square statistical test, but still imprecise because of the wide 95% confidence interval [95% CI=(0.459, 0.75)].


Fig. 6a: Functional integrity in health care group

Fig. 6b: Functional integrity in the administrative group


3.7. Inconsistency

Regarding inconsistency in the system, over half of the health care users (52.4%) totally disagreed that the system was inconsistent. The second most common response was neither negative nor positive, given by 23.8% of the individuals in this group. Finally, fewer than 10% of the health care participants regarded the system as inconsistent (Fig. 7a). These results were statistically significant but of poor precision. Even though, in total, 66.7% of the health care users indicated that the system was consistent, leading to significant results (p<0.05), these results are also imprecise [95% CI=(0.465, 0.869)].

The majority of the administrative users (58.1%) believed that the system was consistent, whereas 18.6% of the respondents in this group indicated that it was inconsistent. Finally, a considerable proportion of the group (23.3%) neither supported nor rejected this statement (Fig. 7b). These results are statistically significant (p<0.05) but of inadequate precision [95% CI=(0.434, 0.728)].


Fig. 7a: Inconsistency in the health care group

Fig. 7b: Inconsistency in the administrative group


3.8. Quick Learning of the System

Over 80% of the health care users indicated that the system was easy to learn, while the proportion indicating that it was difficult to learn amounted to 14.3% (Fig. 8a). These results are statistically significant (p<0.01); however, the confidence interval is fairly wide [95% CI=(0.693, 0.927)], suggesting poor precision.

In the administrative group, the majority (65.2%) indicated that the system was easy to learn, whereas 16.3% of the respondents in this group held the opposite opinion. Finally, the percentage that neither agreed nor disagreed with this statement was considerable (18.6%) (Fig. 8b). The results concerning how easy the system is to learn are statistically significant (p<0.01) but imprecise, due to the wide confidence interval [95% CI=(0.509, 0.793)].


Fig. 8a: Quick system's learning in the health care group

Fig. 8b: Quick system's learning in the administrative group


3.9. Cumbersome to Use

In the health care group the responses varied: 42.9% of the participants stated that the system is convenient to use, whereas 38.1% asserted that it is cumbersome to use. The remaining 19% held neither opinion (Fig. 9a). These results are statistically insignificant.

In the administrative group, 65.1% of the participants found the system manageable to use, whereas 20.9% of the respondents in this group agreed that it is cumbersome to use (Fig. 9b). These results are statistically significant (p<0.01) using the Chi-square statistical test, but imprecise due to the wide 95% confidence interval [95% CI=(0.509, 0.793)].


Fig. 9a: Cumbersome to use in the health care group

Fig. 9b: Cumbersome to use in the administrative group


3.10. Confidence

The majority of the health care participants (76.2%) felt confident using the system, and only 9.6% of the respondents in this group felt insecure using it (Fig. 10a). Using the Chi-square test, these results proved statistically significant (p<0.01) but of low precision, due to the wide confidence interval [95% CI=(0.580, 0.944)].

Similarly, most of the administrative users (69.8%) felt confident using the system, whereas only 9.4% of the users in this group disagreed with this statement. The remaining 20.8% felt neither confident nor insecure using the system (Fig. 10b). These results are also statistically significant (p<0.01), but imprecise as well [95% CI=(0.561, 0.835)].


Fig. 10a: Confidence in the health care group

Fig. 10b: Confidence in the administrative group


3.11. Need to Learn Before the Use of the System

Almost half of the health care respondents (47.6%) strongly agreed that there was no need to be trained before using the system; in total, 61.9% of the users in the health care group indicated that there was no need to learn a lot of things before using it. On the other hand, 19% of the participants in this group indicated that training was necessary, whereas the remainder neither supported nor rejected the need to learn things before using the system (Fig. 11a). These results proved significant using the Chi-square test (p<0.05), but the very wide confidence interval [95% CI=(0.411, 0.826)] suggests very poor precision.

Although over half of the administrative users indicated that there was no need to learn things before using the system, a considerable proportion of the group (34.9%) claimed that training before use is important (Fig. 11b). The results are statistically significant (p<0.05), but the 95% confidence interval is quite wide [95% CI=(0.362, 0.660)], suggesting reduced precision.


Fig. 11a: Need to learn before the use of the system in the health care group

Fig. 11b: Need to learn before the use of the system in the administrative group


3.12. Correlations

Using Pearson's correlation, significant correlations were established at the 0.05 or 0.01 level in both the health care and the administrative group.

In the health care group, confidence and functional integrity show a significant correlation at the 0.01 level, with r=0.668. Functional integrity is also significantly negatively correlated with inconsistency at the 0.05 level, with r=-0.543. Frequency of use and integrity are correlated with r=0.517 and p<0.05. Frequency of use is also correlated with confidence, with r=0.660 and p<0.01. Another interesting feature is the correlation of cumbersome to use with technician's support (r=0.689 and p<0.01) and with inconsistency (r=0.470 and p<0.05). Complexity is significantly correlated with inconsistency (r=0.812 and p<0.01), integrity (r=0.490 and p<0.05), cumbersome to use (r=0.484 and p<0.05) and technician's support (r=0.546 and p<0.05).

In the administrative group, significant correlations are also demonstrated. Frequency of use is significantly correlated with confidence (r=0.529 and p<0.01), as well as with the need to learn before using the system (r=0.483 and p<0.01) and with integrity (r=0.356 and p<0.05). Frequency of use is also significantly negatively correlated with complexity (r=-0.321 and p<0.05). The correlation between inconsistency and easy to use is significant at the 0.01 level, with r=0.406. The correlation between inconsistency and technician's support is also significant, with r=0.473 and p<0.01. A significant negative correlation is demonstrated between cumbersome to use and integrity (r=-0.384 and p<0.05). Integrity is also significantly correlated with confidence (r=0.469 and p<0.01). The negative correlations between complexity and confidence (r=-0.410 and p<0.01) and between complexity and integrity (r=-0.371 and p<0.05) are significant. Complexity is also significantly correlated with cumbersome to use (r=0.478 and p<0.01) and inconsistency (r=0.330 and p<0.05). Finally, integrity is negatively correlated with inconsistency (r=-0.304 and p<0.05) and cumbersome to use (r=-0.323 and p<0.05).
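A pairwise correlation table of this kind can be generated in a few lines of code; the sketch below is illustrative only (the file name and column names are assumptions) and simply flags the item pairs whose Pearson correlation reaches significance at the 0.05 level:

```python
from itertools import combinations

import pandas as pd
from scipy import stats

# Hypothetical per-respondent item ratings (assumed file and column names).
df = pd.read_csv("sus_responses.csv")
items = ["use_frequency", "complexity", "easy_to_use", "tech_support",
         "integrity", "inconsistency", "quick_learning", "cumbersome",
         "confidence", "learn_before_use"]

# Report every pair of items whose correlation is significant.
for a, b in combinations(items, 2):
    r, p = stats.pearsonr(df[a], df[b])
    if p < 0.05:
        print(f"{a} vs {b}: r = {r:.3f}, p = {p:.3f}")
```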


4. Discussion

4.1. General Considerations

The information system of Kavala's Hospital is limited in functionality and is used by only a small percentage of the health care professionals, whereas in the administrative departments of the hospital it is used by the majority of the employees. As a result, the percentage of non-users among the administrative employees was considerable, and the percentage of non-users among the health care professionals was extremely high.

The percentage of persons who declined to participate in the study was very small. This was probably due to how easy the SUS is to fill in and to its small size. In this way, the SUS firstly contributes to the participants' convenience and secondly reduces the risk of a small sample size caused by participants refusing to fill in the questionnaire. Consequently, the SUS proved a successful questionnaire for the evaluation of user satisfaction in hospitals, not only because of its flexibility, but also because of the wide range of aspects that it covers.

The statistical evaluation of the data revealed that the majority of both administrative and health care users would like to use the system frequently. As for perceived ease of use, the opinions of both health care professionals and administrative employees varied, although with similarities between the two groups.

Health care users appear to believe that the system is easy to learn, so there is no need to acquire extra knowledge before its use. They also considered the system well integrated and consistent, giving them confidence and a desire for frequent use.

Administrative users, for their part, disagreed that the system is cumbersome or complex. On the contrary, they appear to believe that it is well integrated and consistent, giving them confidence for frequent use. They also found it easy to learn, without requiring a lot of knowledge prior to use.

4.2. Correlations

The findings of this study for the health care group suggested that a well-integrated system makes users confident, so that they can handle it more frequently. Integrity significantly affects inconsistency and vice versa.


The same bilateral relation applies to inconsistency, which is significantly affected by complexity. When an information system is not well integrated it becomes inconsistent and complex, and therefore cumbersome for the user. Because complexity results in a system that is confusing for the user, support from a technician becomes essential. Therefore, as far as user satisfaction is concerned, a correlation pathway among the significant aspects affecting the successful adoption of the system, in which all the above are represented, may be hypothesised.

In the administrative group, interesting interrelations are also demonstrated. A well-integrated information system makes its users confident enough to handle the system frequently, whereas a poorly integrated system makes them impatient and unwilling to use it regularly. A complex system is cumbersome to use. A well-integrated information system is also consistent and less complex. Thus, according to the findings of the study, the negative and bilateral relationship between complexity and integrity is very strong: a complex system is not an integrated system and vice versa. A complex system therefore creates a sense of insecurity in its users and becomes cumbersome and inconsistent for them, and an inconsistent system makes support from a technician vital. All the above are demonstrated in the following scheme.

[Scheme: correlation pathway linking Inconsistency, Cumbersome, Complexity, Integrity, Use Frequency, Confidence and Technician’s Support]


User satisfaction depends on the way the system facilitates the user (Dupuits and Hasman, 1995). For the successful adoption of the system, and therefore for user satisfaction, it is also necessary that the system be integrated; in that case users will be confident enough to use it frequently. A system that is not well integrated is complex and inconsistent. The triangles of Confidence-Integrity-Frequency of use (CIF triangle) and Integrity-Complexity-Inconsistency (ICI triangle) were found to have significant value in both groups. There is therefore a strong indication for their adoption in evaluation studies.
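As a follow-up to the correlation sketch given earlier, the check below illustrates one way future evaluation studies could operationalise the proposed CIF and ICI triangles: a triangle is treated as supported when all three pairwise correlations among its aspects are statistically significant. The column names and the significance threshold are assumptions for illustration, not part of the thesis.

```python
# Sketch: test whether every pairwise correlation within a "triangle" of SUS
# aspects is significant. Assumes a pandas DataFrame with illustrative columns.
from itertools import combinations
from scipy.stats import pearsonr

TRIANGLES = {
    "CIF": ["confidence", "integrity", "frequency"],
    "ICI": ["integrity", "complexity", "inconsistency"],
}

def triangle_supported(df, aspects, alpha=0.05):
    """True if all three pairwise correlations among `aspects` have p < alpha."""
    return all(pearsonr(df[a], df[b])[1] < alpha
               for a, b in combinations(aspects, 2))

# Example use with the DataFrame from the earlier sketch:
# for name, aspects in TRIANGLES.items():
#     print(name, triangle_supported(responses, aspects))
```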

4.3. Conclusion

The development of information system technology in the hospital environment is at an early stage and still under consideration by the local authorities. Even when this milestone is overcome, great emphasis should be placed on the level of training of all health-related and administrative professionals. The fact that most of the health professionals are not users of


this technology creates a field of confusion that mainly affects the people seeking medical care. For the successful adoption of an innovative HIS, evaluation studies should include tests of user satisfaction. The CIF and ICI triangles could be important concepts for the assessment of user satisfaction and, therefore, of hospital information systems. Further research using a larger sample of both health-related professionals and administrative employees is recommended for future trials. The SUS questionnaire used in this study proved very useful for this purpose. However, a more holistic approach to the evaluation of hospital information systems is still required.


REFERENCES

1. Aarts, J. and Berg, M. 2004. ‘A tale of two hospitals: a sociotechnical

appraisal of the introduction of computerized physician order entry in two

Dutch hospitals’. Medinfo, 11(2), pp.999-1002.

2. Ahearn, M.D. and Kerr, S.J. 2003. ‘General practitioners' perceptions of

the pharmaceutical decision-support tools in their prescribing software’.

Medical Journal of Australia, 179(1), pp.34-37.

3. Ammenwerth, E., Mansmann, U., Iller, C. and Eichstädter, R. 2003a.

‘Factors affecting and affected by user acceptance of computer-based

nursing documentation: results of a two-year study’. Journal of the

American Medical Informatics Association, 10, pp.69-84.

4. Ammenswerth, E., Gräber, S., Herrmann, G., Bürkle, T., König, J. 2003b.

‘Evaluation of health information systems-problems and challenges’.

International Journal of Medical Informatics, 71(2-3), pp.125-135.

5. Ammenwerth, E., Kaiser, F., Wilhelmy, I. and Hofer, S. 2003c.

‘Evaluation of user acceptance of information systems in health care: the

value of questionnaires’. Studies in Health Technology and Informatics,

95, pp.642-648.

6. Ammenwerth, E., Brender, J., Nykänen, P., Prokosch, H., Rigby, M. and Talmon, J. 2004. ‘Visions and strategies to improve evaluation of health information systems. Reflections and lessons based on the HIS-EVAL workshop in Innsbruck’. International Journal of Medical Informatics, 73(6), pp.479-491.

7. Anderson, JG. 2004. ‘Information technology for detecting medication errors and adverse drug events’. Expert Opinion on Drug Safety, 3(5), pp.449-455.

8. Ash, J. 1997. ‘Organizational factors that influence information technology diffusion in academic health sciences centers’. Journal of the American Medical Informatics Association, 4(2), pp.102-109.

9. Ash, J. and Berg, M. 2003. ‘Report of conference Track 4: socio-

technical issues of HIS’. International Journal of Medical Informatics,

69(2-3), pp.305-306.

10. Ash, JS., Sittig, DF., Seshadri, V., Dykstra, RH., Carpenter, JD. and

Stavri, PZ. 2005. ‘Adding insight: A qualitative cross-site study of


physician order entry’. International Journal of Medical Informatics, 74(7-

8), pp.623-628.

11. Aydin, C.E. 1994. ‘Survey methods for assessing social impact of

computers in health care organizations’, In: Anderson, J.G., Aydin, C.E.

and Jay, S.J. (Eds.), Evaluating health care information systems:

methods and applications, Sage Publications, Thousand Oaks, pp.69-

115.

12. Bailey, J.E. and Pearson, S. 1983. ‘Development of a Tool for Measuring

and Analyzing Computer User Satisfaction’. Management Science,

29(5), pp.530-545.

13. Bailey, J.E. 1990. ‘Development of an instrument for the management of

computer user attitudes in hospitals’. Methods of Information in Medicine,

29(1), pp.51-56.

14. Bajaj, A. and Nidumolu, S.R. 1998. ‘A feedback model to understand

information system usage’, Information & Management, 33(4), pp.213-

224.

15. Bakker, A.R. 2003. ‘Views on HIS development; recommendations of

earlier working conferences compared with present challenges’.

International Journal of Medical Informatics, 69(2-3), pp.91-97.

16. Ball, MJ. 2003. ‘Hospital information systems: perspectives on problems

and prospects, 1979 and 2002’. International Journal of Medical

Informatics, 69(2-3), pp.83-89.

17. Barbour, R.S. 1999. ‘The case for combining qualitative and quantitative

approaches in health services research’. Journal of Health Services

Research & Policy, 4(1), pp.39-43.

18. Barki, H. and Hartwick, J. 1994. ‘Measuring user participation, user

involvement, and user attitude’. MIS Quarterly, 18(1), pp.59-82.

19. Baroudi, J.J., Olson, M.H. and Ives, B. 1986. ‘An empirical study of the

impact of user involvement on system usage and information

satisfaction’, Communications of the ACM, 29(3), pp.232-238.

20. Baroudi, J.J. and Orlikowski, W.J. 1988. ‘A Short-Form Measure of User Information Satisfaction: A Psychometric Evaluation and Notes on Use’. Journal of Management Information Systems, 4(4), pp.44-59.


21. Berg, M. 1999. ‘Patient care information systems and health care work: a

sociotechnical approach’. International Journal of Medical Informatics,

55(2), pp.87-101.

22. Beuscart-Zephir, M.C., Brender, J., Beuscart, R. and Menager-

Depriester, I. 1997. ‘Cognitive evaluation: how to assess the usability of

information technology in healthcare’. Computer Methods and Programs

in Biomedicine, 54(1-2), pp.19-28.

23. Beynon-Davies, P. and Lloyd-Williams, M. 1999. ‘When health

information systems fail’. Topics in Health Information Management,

20(1), pp.66-79.

24. Bindels, R., Hasman, A., Derickx, M., Van Wersch, J.W. and Winkens,

R.A. 2003. ‘User satisfaction with a real-time automated feedback

system for general practitioners: a quantitative and qualitative study’.

International Journal for Quality in Health Care, 15, pp.501-508.

25. Blobel, B. and Holena, M. 1998. ‘CORBA security services for health

information systems’, International Journal of Medical Informatics, 52,

pp.29-37.

26. Bose, R. 2003. ‘Knowledge management-enabled health care

management systems: capabilities, infrastructure, and decision-support’.

Expert Systems with Applications, 24(1), pp.59-71.

27. Brigl, B., Ammenwerth, E., Dujat, C., Gräber, S., Große, A., Häber, A.,

Jostes, C. and Winter, A. 2005. ‘Preparing strategic information

management plans for hospitals: a practical guideline SIM plans for

hospitals: a guideline’. International Journal of Medical Informatics, 74(1),

pp.51-65.

28. Brooke, J. 1986. ‘SUS: A Quick and Dirty Usability Scale’. Retrieved

August 16, 2005 from

http://www.usability.serco.com/trump/methods/satisfaction.htm

29. Brumini, G., Ković, I., Zombori, D., Lulić, I., Bilic-Zulle, L. and Petrovečki,

M. 2005. ‘Comparisons of physicians’ and nurses’ attitudes towards

computers’. Studies in Health Technology and Informatics, 116, pp.608-

613.


30. Butler, M.A. and Bender, A.D. 1999. ‘Intensive care unit bedside

documentation systems. Realizing cost savings and quality

improvements’. Computers in Nursing, 17(1), pp.32-38.

31. Chang, Z., Mei, S., Gu, Z., Gu, J., Xia, L., Liang, S. and Lin, J. 2003.

‘Realization of integration and working procedure on digital hospital

information system’. Computer Standards & Interfaces, 25, pp.529-537.

32. Chin, H.L. and McClure, P. 1995. ‘Evaluating a comprehensive

outpatient clinical information system: a case study and model for system

evaluation’. Proceedings of the Nineteenth Symposium of Computer

Applications in Medical Care, pp.717-721.

33. Chiu, C., Hsu, M., Sun, S., Lin, T. and Sun, P. 2005. ‘Usability, quality,

value and e-learning continuance decision’. Computers & Education,

45(4), pp.399-416.

34. Ciccarese, P., Caffi, E., Quaglini, S. and Stefanelli, M. 2005.

‘Architectures and tools for innovative health information systems: the

guide project’, International Journal of Medical Informatics, 74(7-8),

pp.553-562.

35. Clarke, K., O'Moore, R., Smeets, R., Talmon, J., Brender, J., McNair, P.,

Nykänen, P., Grimson, J. and Barber, B. 1994. ‘A methodology for

evaluation of knowledge-based systems in medicine’. Artificial

Intelligence in Medicine, 6(2), pp.107-21.

36. Collins COBUILD English language dictionary 1990, 1st ed., Collins

Publishers, London.

37. Currie, L.M. 2005. ‘Evaluation frameworks for nursing informatics’,

International Journal of Medical Informatics [In press].

38. Davis, F.D. 1989. ‘Perceived usefulness, perceived ease of use and user

acceptance of information technology’. MIS Quarterly, 13, pp. 319-340.

39. Davis, F.D., Bagozzi, R.P. and Warshaw, P.R. 1989. ‘User acceptance of

computer technology: A comparison of two theoretical models’.

Management Science, 35(8), pp.982-1003.

40. Davis, F.D. 1993. ‘User acceptance of information technology: system

characteristics, user perceptions and behavioural impacts’. International

Journal of Man-Machine Studies, 38, pp.475-487.


41. Delone, W.H. and McLean, E.R. 1992. ‘Information systems success: the

quest for the dependent variable’. Information Systems Research, 3(1),

pp.60-95.

42. Despont-Gros, C., Mueller, H. and Lovis, C. 2005. ‘Evaluating user

interactions with clinical information systems: A model based on human-

computer interaction methods’. Journal of Biomedical Informatics, 38,

pp.244-255.

43. Dixon, RD. 1999. ‘The behavioral side of information technology’.

International Journal of Medical Informatics, 56, pp. 117-123.

44. Doll, W. and Torkzadeh, G. 1988. ‘The measurement of end-user

computing satisfaction’. MIS Quarterly, 12 (2), pp.259-274.

45. Dupuits, F.M.H.M. and Hasman, A. 1995. ‘User satisfaction of general

practitioners with HIOS+, a medical decision support system’. Computer

Methods and Programs in Biomedicine, 47, pp.183-188.

46. Dziuban, SW. 1999. ‘Using database information in your clinical

practice’. Annals of Thoracic Surgery, 68, pp.350-352.

47. Edwards, M. and Moczygemba, J. 2004. ‘Reducing medical errors

through better documentation’. The Health Care Manager, 23(4), pp.329-

333.

48. Feied, CF., Handler, JA., Smith, MS., Gillam, M., Kanhouwa, M.,

Rothenhaus, MS., Conover, K. and Shannon, T. 2004. ‘Clinical

information systems: instant ubiquitous clinical data for error reduction

and improved clinical outcomes’. Academic Emergency Medicine,

11(11), pp.1161-1169.

49. Forsythe, D.E. and Buchanan, B.G., 1991. ‘Broadening our approach to

evaluating medical information systems’. Proceedings- The Annual

Symposium on Computer Applications in Medical Care, pp.8-12.

50. Friedman, C.P. and Abbas, U.L. 2003. ‘Is medical informatics a mature

science? A review of measurement practice in outcome studies of clinical

systems’. International Journal of Medical Informatics, 69(2-3), pp.261-

272.

51. Friedman, C.F. and Haug, P. 2003. ‘Report on conference track 5:

evaluation metrics and outcome’. International Journal of Medical

Informatics, 69(2-3), pp.307-309.


52. Gadd, CS., Baskaran, P. and Lobach, D.F. 1998. ‘Identification of design

features to enhance utilization and acceptance of systems for Internet-

based decision support at the point of care’. Proceedings of AMIA

Annual Fall Symposium, Philadelphia: Hanley & Belfus, Inc. pp.91-95.

53. Gadd, C.S. and Penrod, L.E. 2001. ‘Assessing physician attitudes

regarding use of an outpatient EMR: a longitudinal, multi-practice study’.

Proceedings of AMIA Symposium, pp.194-198.

54. Gardner, R.M. and Lundsgaarde, H.P. 1994. ‘Evaluation of user

acceptance of a clinical expert system’. Journal of the American Medical

Informatics Association, 1(6), pp.428-438.

55. Gefen, D. and Ridings, C.M. 2003. ‘IT acceptance: managing user-IT group boundaries’. ACM SIGMIS Database, 34(3), pp.25-40. Retrieved September 16, 2005, from the ACM Portal.

56. Gell, G., Madjaric, M., Leodolter, W., Köle W. and Leitner H. 2000. ‘HIS

purchase projects in public hospitals of Styria, Austria’. International

Journal of Medical Informatics, 58-59, pp.147-155.

57. Giuse, D.A. and Kuhn, K.A. 2003. ‘Health information systems

challenges: the Heidelberg conference and the future’. International

Journal of Medical Informatics, 69(2-3), pp.105-114.

58. Grémy, F., and Degoulet, P. 1993. ‘Dimensions of technology

assessment include techniques, medical and health efficacy, economics,

sociology, and law and ethics’. Medical Informatics, 18(3), pp.185-193.

59. Grémy, F. 2005. ‘Hardware, Software, Peopleware, Subjectivity: A

Philosophical Promenade’. Methods of Information in

Medicine, 44, pp.352-358.

60. Grover, V., Jeong, S. and Segars, A. 1996. ‘Information systems

effectiveness: the construct space and patterns of application’.

Information & Management, 31, pp.177-191.

61. Guptill, J. 2005. ‘Knowledge management in health care’. Journal of

Health Care Finance, 31(3), pp.10-14.

62. Harper, B.D. and Norman, K.L. ‘Improving user satisfaction: the questionnaire for user interaction satisfaction version 5.5’. Retrieved August 18, 2005, from http://lap.umd.edu/quis/publications/harper1993.pdf


63. Harteloh, P.P. 2003. ‘Quality systems in health care: a sociotechnical

approach’. Health Policy, 64(3), pp.391-398.

64. Heeks, R. 2005a. ‘Health information systems- past, present, future’.

International Journal of Medical Informatics, [In press].

65. Heeks, R. 2005b. ‘Health information systems: failure, success and

improvisation’. International Journal of Medical Informatics, [In press].

66. Jaspers, M.W.M., Ammenwerth, E., Ter Burg, W.J.P.P., Kaiser, F. and

Haux, R. 2004. ‘An international course on strategic information

management for medical informatics students: International perspectives

and evaluation’. International Journal of Medical Informatics, 73(11-12),

pp.807-815.

67. Johnson, K.B., Ravich, W.J. and Cowan J.A. 2004. ‘Brainstorming about

next-generation computer-based documentation: an AMIA clinical

working group survey’. International Journal of Medical Informatics,

73(11-12), pp.665-674.

68. Johnson, W. 2005. ‘The planning cycle’. The Journal of Healthcare

Information Management, 19(3), pp. 56-64.

69. Igbaria, M. and Guimaraes, T. 1994. ‘Empirically testing the outcomes of

user involvement in DSS development’. Omega, 22(2), pp.157-172.

70. Kalogeropoulos, D.A., Carson, E.R. and Collinson P.O. 2003. ‘Towards

knowledge-based systems in clinical practice: Development of an

integrated clinical information and knowledge management support

system’. Computer Methods and Programs in Biomedicine, 72(1), pp.65-

80.

71. Kaplan, B. 1997. ‘Addressing Organizational Issues into the Evaluation

of Medical System’. Journal of the American Medical Informatics

Association, 4(2), pp. 94–101.

72. Kaplan, B. 2001. ‘Evaluating informatics applications—some alternative

approaches: theory, social interactionism, and call for methodological

pluralism’. International Journal of Medical Informatics, 64(1), pp.39-56.

73. Karlsson, D., Ekdahl, C., Wigertz, O., and Forsum, U. 1997. ‘A

Qualitative study of clinicians ways of using a decision-support system’.

Journal of the American Medical Informatics Association, 4, pp.268-272.


74. Katehakis, D.G., Kostomanolakis, S., Tsiknakis, M. and Orphanoudakis,

S.C. 2001. ‘An open, component-based information infrastructure to

support integrated regional healthcare networks’. Medinfo, 10(1), pp.18-

22.

75. Kaushal, R. and Bates, D.W. 2002. ‘Information technology and

medication safety: what is the benefit?’. Quality & Safety in Health Care,

11, pp. 261-265.

76. Kaushal, R., Shojania, K.G. and Bates, D.W. 2003. ‘Effects of

computerized physician order entry and clinical decision support systems

on medication safety: A systematic review’. Archives of Internal

Medicine, 163, pp. 1409-1416.

77. Kujala, S. 2003. ‘User involvement: a review of the benefits and

challenges’. Behaviour and Information Technology, 22(1), pp.1-16.

78. Kuperman, G.J. and Gibson, R.F. 2003. ‘Computer physician order entry:

benefits, costs, and issues’. Annals of Internal Medicine, 139, pp. 31-39.

79. Lee, T. 2004. ‘Evaluation of computerized nursing care plan: Instrument

development’. Journal of Professional Nursing, 20(4), pp.230-238.

80. Legris, P., Ingham, J. and Collerette, P. 2003. ‘Why do people use information technology? A critical review of the technology acceptance model’. Information & Management, 40(3), pp.191-204.

81. Li, E.Y. 1997. ‘Perceived importance of information system success

factors: A meta analysis of group differences’. Information &

Management, 32, pp.15-28.

82. Li, L., Benton, WC. 2005. ‘Hospital technology and nurse staffing

management decisions’. Journal of Operations Management, [In press].

83. Liberman, N. and Förster, J. 2005. ‘Inferences from decision

difficulty’. Journal of Experimental Social Psychology, [In press].

84. Littlejohns, P., Wyatt, J.C. and Garvican, L. 2003. ‘Evaluating

computerised health information systems: hard lessons still to be learnt’.

BMJ, 326, pp.860-863.

85. Loef, C. and Truyen, R. 2005. ‘Evidence and diagnostic reporting in the

IHE context’. Academic Radiology, 12(5), pp.620-625.


86. Lu, Y., Xiao, Y., Sears, A. and Jacko, J. 2005. ‘A review and a framework

of handheld computer adoption in healthcare’. International Journal of

Medical Informatics, 74, pp. 409-422.

87. Mahmood, M.A., Burn, J.M., Gemoets, L.A. and Jacquez, C. 2000.

‘Variables affecting information technology end-user satisfaction: a meta-

analysis of the empirical literature’. International Journal of Human-

Computer Studies, 52, pp.751-771.

88. Marx, D.A. and Slonim, A.D. 2003. ‘Assessing patient safety risk before

the injury occurs: an introduction to sociotechnical probabilistic risk

modelling in health care’. Quality & Safety in Health Care, 12(Suppl 2),

pp. 33-38.

89. Maybloom, B. and Champion, Z. 2003. ‘Development and

implementation of a multicentre information system for paediatric and

infant critical care’. Intensive and Critical Care Nursing, 19(6), pp.326-

341.

90. McKeen, J.D. and Guimaraes, T. 1997. ‘Successful strategies for user

participation in systems development’. Journal of Management

Information Systems, 14(2), 133-150.

91. Mikulich, V.J., Liu, Y.C., Steinfeldt, J. and Schriger, D.L. 2001.

‘Implementation of clinical guidelines through an electronic medical

record: physician usage, satisfaction and assessment’. International

Journal of Medical Informatics, 63(3), pp.169-178.

92. Miller, RA. 1996. ‘Evaluating evaluations of medical diagnostic systems’.

Journal of the American Medical Informatics Association, 3(6), pp.429-

431.

93. Moehr, J.R. 2002. ‘Evaluation: salvation or nemesis of medical

informatics’. Computers in Biology and Medicine, 32, pp.113-125.

94. Monteiro, E. 2003. ‘Integrating health information systems: a critical

appraisal’. Methods of Information in Medicine, 42, pp.428-432.

95. Nykänen, P., Enning, J., Talmon, J., Hoyer, D., Sanz, F., Thayer,

C., Roine, R., Vissers, M. and Eurlings, F. 1999. ‘Inventory of validation

approaches in selected health telematics projects’. International Journal

of Medical Informatics, 56(1-3), pp.87-96.


96. Odhiambo-Otieno, G.W. 2005. ‘Evaluation of existing District Health

Management Information Systems: A case study of the District Health

Systems in Kenya’. International Journal of Medical Informatics, 74(9),

pp.733-744.

97. Pietka, E. 2003. ‘Large-scale hospital information system in clinical

practice’. International Congress Series, 1256, pp.843-848.

98. Reddy, M., Pratt, W., Dourish, P. and Shabot, M.M. 2003.

‘Sociotechnical requirements analysis for clinical systems’. Methods of

Information in Medicine, 42(4), pp.437-444.

99. Rigby, M. 2001. ‘Evaluation: 16 powerful reasons why not to do it-and 6

over-riding imperatives’. Medinfo, 10(Pt 2), pp.1198-1202.

100. Rothschild, J. 2004. ‘Computerized physician order entry in the critical care and general inpatient setting: a narrative review’. Journal of Critical Care, 19(4), pp.271-278.

101. Ruland, CM. and Bakken, S. 2001. ‘Representing patient preference-

related concepts for inclusion in electronic health records’. Journal of

Biomedical Informatics, 34, pp.415-422.

102. Ruland, CM. 2004. ‘Improving patient safety through informatics tools for

shared decision making and risk communication’. International Journal of

Medical Informatics, 73(7-8), pp.551-557.

103. Ruotsalainen, P. 2004. ‘A cross-platform model for electronic health

record communication’. International Journal of Medical Informatics,

73(3), pp.291-295.

104. Simpson, RL. 2004. ‘Managing the three ‘P’s to improve patient safety:

nursing administration’s role in managing information technology’.

International Journal of Medical Informatics, 73(7-8), pp.559-561.

105. Southon, F.C.G., Sauer, C. and Crant, C.N. 1997. ‘Information

technology in complex health services: organizational impediments to

successful technology transfer and diffusion’. Journal of the American

Medical Informatics Association, 4, pp.112-124.

106. Southon, G., Sauer, C. and Dampney, K. 1999. ‘Lessons from a failed

information systems initiative: issues for complex organisations’.

International Journal of Medical Informatics, 55(1), pp.33-46.


107. Staccini, P., Joubert, M., Quaranta, J. and Fieschi, M. 2005. ‘Mapping care processes within a hospital: from theory to a web-based proposal merging enterprise modelling and ISO normative principles’. International Journal of Medical Informatics, 74(2-4), pp.335-344.

108. Staples, D.S., Wong, I. and Seddon, P.B. 2002. ‘Having expectations of information systems benefits that match received benefits: does it really matter?’. Information & Management, 40(2), pp.115-131.

109. Szajna, B. and Scamell, R. 1993. ‘The effects of information system user

expectations on their performance and perceptions’. MIS Quarterly,

17(4), pp.493-516.

110. Takeda, H., Matsumura, Y., Kuwata, S., Nakano, H., Sakamoto, N. and

Yamamoto, R. 2000. ‘Architecture for networked electronic patient record

systems’. International Journal of Medical Informatics, 60(2), pp.161-167.

111. Talmon, J., Enning, J., Castaneda, G., Eurlings, F., Hoyer, D., Nykänen,

P., Sanz, F., Thayer, C. and Vissers, M. 1999. ‘The VATAM guidelines’.

International Journal of Medical Informatics, 56(1-3), pp.107-115.

112. Taylor, PP. 2005. ‘Use of handheld devices in critical care’. Critical Care

Nursing Clinics of North America, 17(1), pp. 45-50.

113. Travers, D.A. and Downs, S.M. 2000. ‘Comparing user acceptance of a

computer system in two pediatric offices: a qualitative study’.

Proceedings of AMIA Symposium, pp.853-857.

114. Tsiknakis, M., Katehakis, D. and Orphanoudakis, S.C. 2004. ‘A health

information infrastructure enabling secure access to the life-long

multimedia electronic health record’. International Congress Series,

1268, pp.289-264.

115. van Merode, GG, Groothuis, S. and Hasman, A. 2004. ‘Enterprise

resource planning for hospitals’, International Journal of Medical

Informatics, 73(6), pp.493-501.

116. Venkatesh, V., Morris, M.G., Davis, G.B. and Davis, F.D. 2003. ‘User

acceptance of information technology: Towards a unified view’. MIS

Quarterly, 27, pp.425-478.

117. Vlahos, G.E., and Ferratt, T.W. 1999. ‘Information technology use by

managers in Greece to support decision making: amount, perceived

value, and satisfaction’. Information & Management, 29, pp.305-315.


118. Warburton, RN. 2005. ‘Patient safety — how much is enough?’. Health

Policy, 71(2), pp.223-232.

119. Winter, A., Brigl, B. and Wendt, T. 2003. ‘Modelling hospital information

systems (part 1): the revised three-layer graph-based meta model

3LGM2’. Methods of Information in Medicine, 42(5), pp.544-551.

120. Wu, J. and Wang, Y. 2005. ‘Measuring ERP success: The key-users’

viewpoint of the ERP to produce a viable IS in the organization’.

Computers in Human Behavior, [In press].

121. Wyatt, J. and Spiegelhalter, D. 1990. ‘Evaluating medical expert

systems: what to test and how?’. Medical Informatics, 15, pp.205-217.

122. Wyatt, J.C. and Wyatt, S.M. 2003. ‘When and how to evaluate health

information systems’. International Journal of Medical Informatics, 69(2-

3), pp.251-259.

123. Yang, H. and Yoo, Y. 2004. ‘It's all about attitude: revisiting the

technology acceptance model’. Decision Support Systems, 38(1), pp.19-

31.

124. Zain, M., Che Rose, R., Abdullah, I. and Masrom, M. 2005. ‘The

relationship between information technology acceptance and

organizational agility in Malaysia’. Information & Management, 42(6),

pp.829-839.

125. Zielstorff, R.D., Estey, G., Vickery, A., Hamilton, G., Fitzmaurice, J.B.

and Barnett, G.O. 1997. ‘Evaluation of a decision support system for

pressure ulcer prevention and management: preliminary findings’.

Proceedings of AMIA Symposium, pp.248-252.