Measuring the Performance of Information Systems: A Functional Scorecard



Measuring the Performance of Information Systems: A Functional Scorecard

JERRY CHA-JAN CHANG AND WILLIAM R. KING

JERRY CHA-JAN CHANG is an Assistant Professor in the Department of MIS in the College of Business, University of Nevada, Las Vegas. He has a B.S. in Oceanography from National Ocean University, Taiwan, an M.S. in Computer Science from Central Michigan University, an MBA from Texas A&M University, and an M.S. in MoIS and a Ph.D. in MIS from the University of Pittsburgh. His research interests include performance measurement, IS strategy, management of IS, group support systems, human-computer interaction, organizational learning, and strategic planning. His work has appeared in Information & Management, Decision Support Systems, DATABASE, Communications of the ACM, and Journal of Computer Information Systems, and several major IS conference proceedings.

WILLIAM R. KING holds the title University Professor in the Katz Graduate School of Business at the University of Pittsburgh. He has published more than 300 papers and 15 books in the areas of information systems, management science, and strategic planning. He has served as Founding President of the Association for Information Systems (AIS), President of TIMS (now INFORMS), and Editor-in-Chief of MIS Quarterly. He was instrumental in the creation of INFORMS and of the Information Systems Research journal. He recently received the Leo Lifetime Exceptional Achievement Award from AIS.

ABSTRACT: This study develops an instrument that may be used as an information systems (IS) functional scorecard (ISFS). It is based on a theoretical input-output model of the IS function's role in supporting business process effectiveness and organizational performance. The research model consists of three system output dimensions: systems performance, information effectiveness, and service performance. The updated paradigm for instrument development was followed to develop and validate the ISFS instrument. Construct validation of the instrument was conducted using responses from 346 systems users in 149 organizations by a combination of exploratory factor analysis and structural equation modeling using LISREL. The process resulted in an instrument that measures 18 unidimensional factors within the three ISFS dimensions. Moreover, a sample of 120 matched-paired responses of separate CIO and user responses was used for nomological validation. The results showed that the ISFS measure reflected by the instrument was positively related to improvements in business process effectiveness and organizational performance. Consequently, the instrument may be used for assessing IS performance, for guiding information technology investment and sourcing decisions, and as a basis for further research and


ASSESSING THE INFORMATION SYSTEM (IS) function's performance has long been an important issue to IS executives. This interest is evident from the prominence of this issue in the various IS issue studies [12, 13, 34, 49, 72] as well as the popularity of annual publications such as ComputerWorld Premier 100 and InformationWeek 500, which involve the use of surrogate metrics to assess overall IS functional performance (ISFP). Executives routinely seek evidence of returns on information technology (IT) investments and sourcing decisions, both types of choices that have become more substantial and a competitive necessity. As the unit that has major responsibilities for these decisions, the IS function is usually believed to be an integral part of achieving organizational success. Yet the overall performance of the IS function has proved to be difficult to conceptualize and to measure.

As the outsourcing of IS subfunctional areas such as data centers and help desks has grown into the outsourcing of the entire IS function, there is an ever-growing need for formal performance assessment [61]. This will permit the establishment of baseline measures to use in judging outsourcing success. So, the issue of an overall IS functional metric, which is, and has been, high on IS executives' priorities, is becoming even more important.

Although there has been a good deal of research on IS efficiency, effectiveness, and success at various levels of analysis, overall functional-level performance is one of the least discussed and studied. According to Seddon et al. [88], only 24 out of 186 studies between 1988 and 1996 can be classified as focusing on the IS functional level. Nelson and Cooprider's [70] work epitomizes this need.

Moreover, while there exist metrics and instruments to assess specific IS subfunctions and specific IS subareas, such as data center performance, productivity, and data quality, typically these measures cannot be aggregated in any meaningful way. This limits their usefulness as the bases for identifying the sources of overall performance improvements or degradations. As an anonymous reviewer of an earlier version of this paper said, "The critical issue is that the performance of the IS function is now under the microscope and decisions to insource/outsource and spend/not spend must be made in a structured context." The objective of this research is to develop such an instrument: a scorecard for evaluating overall ISFP.

The Theoretical Bases for the Study

THE DEFINITION OF THE IS FUNCTION that is used here includes all IS groups and departments within the organization [84]. This definition is broad enough to include various structures for the IS function, from centralized to distributed, yet specific enough to include only the formal IS function that can be readily identified.


[Figure 1. Theoretical Input-Output Performance Model. Resources (hardware, software, human resources, and integrated managerial and technical capabilities) feed the IS function; the IS function's outputs (systems, information, and services) constitute IS functional performance, which drives business process effectiveness and organizational performance.]

Figure 1 shows the theoretical framework for the study: the IS function uses resources to produce IS performance, which in turn influences both business process effectiveness and organizational performance.

The resources utilized by the IS function are shown in Figure 1 to be hardware, software, human resources, and integrated managerial and technical capabilities [14, 15, 36]. The IS function is shown to produce systems, information, and services [56, 92], which collectively affect the organization in a fashion that is termed IS functional performance (ISFP), which is to be assessed through an IS functional scorecard (ISFS), the development of which is the objective of this study.

In the theoretical model, IS outputs are also shown as significant enablers and drivers of business process effectiveness, since IS are often the basis for business process operations and redesign [94, 113]. ISFP also is shown to influence business process effectiveness, and both influence overall organizational performance [113].

Although it is not the primary purpose of this study to directly address the business process effectiveness and organizational performance elements of Figure 1, data were collected on these elements of the model for purposes of nomological validation of the scorecard that is being developed.

The model of Figure 1 is based on streams of research in IS capabilities, IS effectiveness/success, IS service quality, IS functional evaluation, and IS subfunctional assessment.

IS Capabilities

IS capabilities are integrated sets of hardware, software, human skills, and management processes.


    technical people, adequate planning budgets, and a well-formulated and specifiedplanning process.

IS Effectiveness/Success

DeLone and McLean [30] categorized over 100 IS dependent variables into six categories and developed an IS success model to describe the relationships between the categories. They concluded that IS success should be a multidimensional measure and recommended additional research to validate the model. Other researchers have since tested and expanded their model [7, 46, 79]. DeLone and McLean [31] have updated the model based on a review of research stemming from their original work. They concluded that their original model was valid and suggested that service quality be incorporated as an important dimension of IS success.

IS Service Quality

Recognizing the importance of the services provided by the IS function, the SERVQUAL measure, originally developed in marketing [74], has been adapted to measure IS service quality [75, 110]. However, the controversy over SERVQUAL in marketing [27] has carried over into IS [52, 104], suggesting that more research needs to be conducted to measure IS service quality. Proponents of this measure sometimes advocate its use as a proxy for ISFP. However, as depicted in Figure 1, it is directly applicable only to one of the three major outputs of the IS function.

IS Functional Evaluation

Only a few studies directly address the comprehensive evaluation of the performance of the IS function. No one has developed a validated metric. Wells [112] studied existing and recommended performance measures for the IS function and identified six important goals/issues. Saunders and Jones [84] developed and validated 11 IS function performance dimensions through a three-round Delphi study. They proposed an IS function performance evaluation model to help organizations select and prioritize IS performance dimensions and to determine assessments for each dimension. Both studies focused on top management's perspective of ISFP and did not offer any specific measures.

IS Subfunctional Assessment

Measuring IS subfunctional performance has been important to IS practitioners and academics, and such measures have been developed at a variety of levels using a


Table 1. Implementation of Cameron and Whetton's [19] Guidelines

1. From whose perspective is effectiveness being assessed? Organizational users of IS services and systems.
2. On what domain of activity is the assessment focused? Products and services provided by the IS function.
3. What level of analysis is being used? The IS function [84].
4. What is the purpose for judging effectiveness? Identify strengths and weaknesses; track overall effectiveness.
5. What time frame is being employed? Periodically, ranging from quarterly to annually.
6. What type of data are being used for judgments of effectiveness? Subjective; perceptual data from individuals.
7. What is the referent against which effectiveness is judged? Past performance measures.

financial perspective (e.g., [10]), a social science perspective (e.g., [80]), an IT value approach [24], a business process viewpoint (e.g., [98]), and probably others.

So, there has been no paucity of interest in IS assessment, or in the development of measures. However, there is a great need for a comprehensive measure of IS performance that will provide a configural, or "Gestalt" [41], view of an organization's formal IS activities and facilitate decision making and functional improvement.

The Methodological Basis for the Study

TO ENSURE THE APPROPRIATENESS OF THE STUDY at the IS functional level, it was designed according to guidelines from the organizational effectiveness literature. These guidelines were developed in response to problems plaguing organizational effectiveness research as described by Steers [95]. Cameron and Whetton [19] developed seven basic guidelines, which are listed in Table 1. Cameron [18] later demonstrated the usefulness of these guidelines in a study of 29 organizations. These guidelines have also been adopted by IS researchers to clarify conceptual developments in examining IS functional effectiveness [69, 88].

The implementations of Cameron and Whetton's [19] guidelines for this study are also shown in Table 1. Thus, the ISFS developed here is defined as organizational IS users' perception of the performance of all of the aspects of the IS function that they have personally experienced. Organizational users of IS services and systems are the primary stakeholder for the IS function [92]. Although there are many other stakeholders for the IS function, users represent the largest group, and


Despite its focus on users, this approach is different from the popular user satisfaction measures [6, 8, 35], because it is designed to assess people's perceptions of the overall IS function rather than to capture users' attitudes toward a specific system.

The Domain and Operationalization of the IS Performance Construct

USERS' PERCEPTIONS OF IS ACTIVITIES derive from their use of the IS products and the services provided by the IS function. IS research has traditionally separated the effects of systems and information as two distinct constructs [30]. However, "system and information quality are attributes of applications, not of IS departments" [87, p. 244]. Therefore, they are not sufficient to reflect the effectiveness of the entire IS function.

Domain of the ISFS Construct

The domain of ISFP used in this study reflects the theory of Figure 1 and the models suggested by Pitt et al. [75] and DeLone and McLean [31]. The definitions of the three basic output-related dimensions are given below. A model of the ISFS construct, using LISREL notation, is presented in Figure 2.

Systems performance: Assesses the quality aspects of systems, such as reliability, response time, ease of use, and so on, and the various impacts that systems have on the user's work. Systems encompass all IS applications that the user regularly uses.

Information effectiveness: Assesses the quality of information in terms of the design, operation, use, and value [108] provided by information, as well as the effects of the information on the user's job. The information can be generated from any of the systems that the user makes use of.

Service performance: Assesses the user's experience with services provided by the IS function in terms of quality and flexibility [38]. The services provided by the IS function include activities ranging from systems development to help desk to consulting.

In order to develop a measurement instrument with good psychometric properties, the updated paradigm that emphasizes establishing the unidimensionality of measurement scales [40, 89] was followed. A cross-sectional mail survey is appropriate to obtain a large sample for analysis and to ensure the generalizability of the resulting instrument.


[Figure 2. Three-Dimensional Model of ISFS, comprising the latent constructs Systems Performance (SYSP), Information Effectiveness (INFOE), and Service Performance (SERVP).]

Measures of the two consequence constructs (business process effectiveness and organizational performance), as shown in Figure 1, were used to assess nomological validity. Whenever possible, previously developed items that had been empirically tested were used or adapted to enhance the validity and reliability of the instrument under development. Some new measures were also developed from reviews of both practitioner and research literatures to reflect developments that have occurred subsequent to the development of the measures from which most items were obtained (e.g., e-commerce, enterprise resource planning [ERP], etc.).

The three output dimensions of Figure 1 are the basis for the three ISFS dimensions.

Systems Performance

Measures of systems performance assess the quality aspects of systems and the various effects that IS have on the user's work. Empirical studies listed under the categories "system quality" and "individual impact" in DeLone and McLean's [30] IS success model were reviewed to collect the measures used in those studies. In addition, instruments were also


included for more updated measures published subsequent to DeLone and McLean's original review.

Information Effectiveness

Measures of information effectiveness assess the quality of the information provided by IS as well as the effects of the information on the user's job. Although DeLone and McLean's [30] "information quality" provided a good source for existing measures, Wang and Strong [109] developed a more comprehensive instrument that encompasses all measures mentioned in DeLone and McLean's review. Therefore, the 118 measures developed by Wang and Strong make up the majority of items in this dimension. However, since the focus of their instrument is on quality of information, in order to ensure coverage of measures on the effects of information on the user's job, some new items were developed.

Service Performance

Measures of service performance assess each user's experience with the services provided by the IS function in terms of the quality and flexibility of the services. The entire IS-SERVQUAL instrument is included in this dimension for comprehensiveness. New measures were also incorporated to augment the IS-SERVQUAL items based on the more comprehensive view of service performance proposed by Fitzgerald et al. [38]. In addition, literature on three areas of IS functional services that were not explicitly covered by the service quality literature (training [60, 65, 71], information centers [11, 44, 47, 66], and help desks [20]) was also reviewed and included to ensure the comprehensiveness of measures for this dimension.

In addition to utilizing existing items to measure these constructs, the emergence of innovations that have come into use since most of the prior instruments were developed prompted the inclusion of new items to measure the IS function's performance in seven new areas: ERP [51], knowledge management [45, 64, 97], electronic business [9, 55], customer relationship management [39], supply chain management [37], electronic commerce [22, 33, 102], and organizational learning [86, 100]. In total, 31 new items gleaned from the practitioner and research literatures to reflect potential user assessments of the IS function's contribution to those areas in terms of systems, information, and services were incorporated to expand the item pools for each dimension.

Instrument Development

A total of 378 items were initially generated. Multiple rounds of Q-sorting and item categorization were conducted [29, 68] to reduce the number of items and to ensure the content validity of the ISFS instrument. The last round of Q-sorting resulted in the subconstructs shown in Table 2.


Table 2. Sub-ISFS Constructs from Q-Sort

Systems performance: Effect on job; Effect on external constituencies; Effect on internal processes; Effect on knowledge and learning; Systems features; Ease of use.

Information effectiveness: Intrinsic quality of information; Contextual quality of information; Presentational quality of information; Accessibility of information; Reliability of information; Flexibility of information; Usefulness of information.

Service performance: Responsiveness; Reliability; Service provider quality; Empathy; Training; Flexibility of services; Cost benefit of services.

The Q-sorts resulted in an ISFS instrument that consists of 42, 36, and 32 items for the three dimensions, respectively. All items were measured using a Likert-type scale ranging from 1 (hardly at all) to 5 (to a great extent), with 0 denoting not applicable. The final version of the instrument is in the Appendix.

Survey Design and Execution

A sample of 2,100 medium-to-large companies with annual sales over $250 million was randomly selected from Hoover's Online (www.hoovers.com) and InformationWeek 500.

To avoid common-source bias, data were collected from two types of respondents in each of the sampled organizations. Data for the ISFS instrument were collected from IS users, and organizational CIOs were asked to respond to a "Consequences of ISFP" survey, which was used as a basis for establishing nomological validity. A packet consisting of one Consequences of ISFP instrument and three ISFS instruments was sent to the CIOs of these companies. The CIO was asked to respond to the Consequences of ISFP survey and to forward the ISFS instruments to three IS users. The characteristics of desirable user-respondents in terms of various functional areas, familiarity with IS, and so on, were specified.

The CIO is deemed to be suitable for receiving the packet because the topic of this research would be of great interest to him or her, therefore increasing the potential for participation. The CIO is also an appropriate respondent to the Consequences of ISFP survey because he or she is at a high-enough position to provide meaningful responses concerning consequences. Although it is possible that the CIO might distribute the


Two rounds of reminders were sent to initial nonrespondents to improve the response rate. In addition, where appropriate, letters were sent to the CIOs who returned the CIO survey soliciting additional user participation.

At the conclusion of data collection in 2001, 346 usable ISFS instruments and 130 Consequences of ISFP surveys were received, with 120 companies having responses from at least one IS user and the CIO. This resulted in a response rate of 7.2 percent for the CIO survey, 5.6 percent for the ISFS questionnaire, and 6.1 percent for matched-pair responses.

Two analyses were conducted to assess possible nonresponse bias. t-Tests of company size in terms of revenue, net income, and number of employees between responding and nonresponding companies showed no significant differences. t-Tests of 30 items randomly selected from the three ISFS dimensions (10 items each) between the early (first third) and late (last third) respondents [4, 62] also showed no significant differences. Therefore, it can be concluded that there was no nonresponse bias in the sample and that the relatively low percentage response rate does not degrade the generalizability of the ISFS instrument [57].

Sample Demographics

The participating companies represent more than 20 industries, with nearly a quarter of the companies in manufacturing (24.6 percent), followed by wholesale/retail (13.8 percent), banking/finance (10.8 percent), and medicine/health (7.7 percent). The range of annual sales was between $253 million and $45.352 billion, with an average of $4 billion for the sample. For the Consequences of ISFP surveys, 46.9 percent of the respondents hold the title of CIO. More than 80 percent of the respondents have titles that are at the upper-management level, indicating that the returned surveys were responded to by individuals at the desired level. For the ISFS instrument, 47.7 percent of the respondents are at the upper-management level and 39.9 percent are at the middle-management level. The respondents are distributed across all functional areas, with accounting and finance, sales and marketing, and manufacturing and operations being the top three.

Instrument Validation

INSTRUMENT VALIDATION REQUIRES THE EVALUATION of content validity, reliability, construct validity, and nomological validity. Following Segars's [89] process for instrument validation, we first use exploratory factor analysis to determine the number of factors, then use confirmatory factor analysis iteratively to eliminate items that loaded on multiple factors to establish unidimensionality.


Content Validity

Content validity can be established by showing that the test items are "a sample of a universe" of the investigator's interest and by "defining a universe of items and sampling systematically within this universe" [26, p. 58]. Churchill [25] recommended specifying the domain of the construct followed by generating a sample of items as the first two steps in instrument development to ensure content validity. Domain development should be based on existing theories, and sample items should come from existing instruments, with the development of new items when necessary.

In this study, domain development was guided by theories in both organizational effectiveness and IS research. Items from existing instruments formed the overwhelming majority of the item pool. The initial items were refined through a series of Q-sorts and a pilot test. These development procedures ensured the content validity of the instrument.

Unidimensionality and Convergent Validity

Unidimensionality requires that "only a single trait or construct is being measured by a set of measures" and is "the most critical and basic assumption of measurement theory" [50, p. 49]. Gerbing and Anderson suggest that confirmatory factor analysis "affords a stricter interpretation of unidimensionality" [40, p. 186] than other commonly used methods. Although the subconstructs of the three basic dimensions described earlier were identified during the Q-sort, those factors needed to be empirically tested. Therefore, exploratory factor analyses were first conducted for items within each dimension to determine the factors. This is acceptable, since the items for each dimension were clearly separated in the instrument into sections with opening statements describing the nature of the items in the sections.

Three separate exploratory factor analyses were conducted using principal components with varimax rotation as the extraction method. There were seven, seven, and five factors with eigenvalues greater than 1.0 that explained 70.8 percent, 68.6 percent, and 69.6 percent of the variance for systems performance, information effectiveness, and service performance, respectively. Review of the items showed that most factors loaded very closely to the subconstructs identified by the Q-sort.
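As a sketch of this step (the authors presumably worked in a commercial statistics package; the third-party factor_analyzer Python package used here is just one convenient stand-in), the eigenvalue-greater-than-1.0 rule and varimax-rotated loadings can be computed as follows. The DataFrame `items` is a hypothetical matrix of respondents by survey items for one dimension.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

def efa_varimax(items: pd.DataFrame):
    # Pass 1: unrotated principal-components solution, used only to count
    # eigenvalues of the correlation matrix above 1.0 (Kaiser criterion).
    fa = FactorAnalyzer(rotation=None, method="principal")
    fa.fit(items)
    eigenvalues, _ = fa.get_eigenvalues()
    n_factors = int((eigenvalues > 1.0).sum())

    # Pass 2: extract that many components and varimax-rotate them.
    fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax", method="principal")
    fa.fit(items)
    loadings = pd.DataFrame(fa.loadings_, index=items.columns)
    variance_explained = fa.get_factor_variance()[1].sum()  # proportion explained
    return n_factors, loadings, variance_explained
```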

To establish unidimensionality, the items that loaded on the same factor were then analyzed with confirmatory factor analysis using LISREL, with two exceptions. One factor in systems performance had only one item. Since it is one of the original ease-of-use items from Davis [29], it was included in the factor that contains the rest of the ease-of-use items. Another factor in information effectiveness had only three items. It would be just-identified for confirmatory factor analysis and was therefore only analyzed in conjunction with other factors in the same dimension. This process resulted in six factors for systems performance, seven for information effectiveness, and five for service performance.
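The study itself used LISREL; as a rough illustration only, the same kind of confirmatory model can be specified in Python with the third-party semopy package. The factor and item names below are invented placeholders, not the paper's final item set.

```python
# Illustrative CFA using semopy (the study used LISREL). "=~" defines a
# latent factor by its indicators; the last line makes SystemsPerformance
# a second-order factor loading on the two subconstructs.
import pandas as pd
import semopy

MODEL_DESC = """
ImpactOnJob =~ sij1 + sij2 + sij3 + sij4
EaseOfUse =~ seu1 + seu2 + seu3
SystemsPerformance =~ ImpactOnJob + EaseOfUse
"""

def fit_cfa(data: pd.DataFrame):
    model = semopy.Model(MODEL_DESC)
    model.fit(data)                          # maximum-likelihood estimation
    loadings = model.inspect()               # parameter table incl. loadings
    fit_indices = semopy.calc_stats(model)   # chi-square, RMSEA, GFI, AGFI, ...
    return loadings, fit_indices
```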

Segars and Grover suggest that measured factors be modeled in isolation, then in pairs, and then as a collective network [91]. Two types of modification were made to improve model


fit. First, items with standardized factor loadings below 0.45 were eliminated [78], one at a time. Second, error terms between pairs of items were allowed to correlate based on a modification index. However, this modification was only implemented when theory suggested that the two items should be correlated. This process was conducted iteratively by making one modification at a time until either good model fit was achieved or no modification was suggested.

Following Segars and Grover's [90] procedure, after every measurement model completed its modification process, pairs of models within each dimension were tested iteratively to identify and eliminate items with cross-loadings. With all cross-loading items eliminated, all factors within the same dimension were tested in a full measurement model. Again, items with cross-loadings in the full model were dropped. After the full measurement models were purified, second-order models that reflect the subconstructs within each ISFS dimension were tested. The final, second-order measurement models for the three ISFS dimensions are presented in Figures 3, 4, and 5.

The chi-square and significant factor loadings provide direct statistical evidence of both convergent validity and unidimensionality [91]. With each of the three ISFS dimensions properly tested independently, all three dimensions were combined and tested for model fit (Figure 6). The complete ISFS model, shown in Figure 6, showed remarkably good fit for such high complexity.

Reliability

In assessing measures using confirmatory factor analysis, a composite reliability for each factor can be calculated [5, 93]. This composite reliability is a measure of "internal consistency of the construct indicators, depicting the degree to which they 'indicate' the common latent (unobserved) construct" [48, p. 612]. Another measure of reliability is the average variance extracted (AVE), which reflects the overall amount of variance that is captured by the construct in relation to the amount of variance due to measurement error [48, 89]. The value of AVE should exceed 0.5 to indicate that the variance explained by the construct is larger than measurement error. The construct reliability and AVE of all dimensions and subconstructs are presented in Table 3.
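Both statistics have closed forms given a standardized CFA solution: CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances), and, for standardized items, AVE is the mean squared loading. A small self-contained sketch follows; the loadings are illustrative values chosen so the outputs land near the IS training row of Table 3, not the paper's actual estimates.

```python
import numpy as np

def composite_reliability(loadings):
    # CR = (sum(lam))^2 / ((sum(lam))^2 + sum(1 - lam^2)) for standardized items
    lam = np.asarray(loadings, dtype=float)
    error_var = 1.0 - lam**2
    return lam.sum()**2 / (lam.sum()**2 + error_var.sum())

def average_variance_extracted(loadings):
    # AVE = mean of squared standardized loadings
    lam = np.asarray(loadings, dtype=float)
    return (lam**2).mean()

# Hypothetical loadings for a weak three-item scale:
weak_scale = [0.57, 0.46, 0.68]
print(round(composite_reliability(weak_scale), 2))        # ~0.59
print(round(average_variance_extracted(weak_scale), 2))   # ~0.33, below the 0.50 cutoff
```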

Table 3 indicates that all subconstructs showed good composite reliability except the IS training scale. However, there are some scales with an AVE below 0.50. This suggests that even though all scales (except one) were reliable in measuring their respective constructs, some of them were less capable of providing good measures of their own construct. Despite the low AVE, those scales were retained to ensure the comprehensiveness of the ISFS instrument.

    Discriminant Validity


[Figure 3. Second-Order Measurement Model for Systems Performance. Items include Sij1 (easier to do job), Sij2 (improve performance), Sij3 (improve decision), Sij4 (give confidence), Sij5 (increase productivity), Sij6 (decision participation), Sij7 (increase awareness), and Sij8 (improve work quality), among others; loadings omitted.]


[Figure 4. Second-Order Measurement Model for Information Effectiveness. Items include Iiq1 (interpretable), Iiq2 (understandable), Iri1 (reliable), Iri2 (verifiable), Icq1 (important/relevant), Iai3 (up-to-date), Iai4 (received in timely manner), Ifi3 (easily integrated), Ifi4 (easily updated), Iui2 (defining problems), Iui3 (making decisions), Iui4 (improve efficiency), and Iui6 (give competitive edge), among others; loadings omitted.]


[Figure 5. Second-Order Measurement Model for Service Performance. Items include Srp1 (responds timely), Ssp3 (show respect), Ssp4 (pleasant to work with), Stg1 (useful training programs), Stg2 (variety of training), and Scb1 (cost-effective services), among others; loadings omitted.]


[Figure 6. The Complete ISFS Model. Second-order factors and their first-order factors: Systems Performance (impact on job; impact on external constituencies; impact on internal processes; impact on knowledge and learning; systems usage characteristics; intrinsic systems quality), Information Effectiveness (intrinsic quality of information; reliability of information; contextual quality of information; presentational quality of information; accessibility of information; flexibility of information; usefulness of information), and Service Performance (responsiveness of services; intrinsic quality of service provider; interpersonal quality of service provider; IS training; flexibility of services). Notes: chi-square = 3,164.90; d.f. = 2,094; p = 0.00; RMSEA = 0.039; GFI = 0.79; AGFI = 0.77.]

Organizational Performance

This construct captures the IS function's contribution to the overall performance of the organization. The literature has focused on assessing the extent to which the IS


Table 3. Reliability of Measurement Factors in ISFS Dimensions

Factor name (Reliability / AVE)

Systems performance (0.92 / 0.66)
- Impact on job (0.95 / 0.68)
- Impact on external constituencies (0.88 / 0.56)
- Impact on internal processes (0.89 / 0.80)
- Impact on knowledge and learning (0.89 / 0.62)
- Systems usage characteristics (0.85 / 0.49)
- Intrinsic systems quality (0.79 / 0.56)

Information effectiveness (0.92 / 0.63)
- Intrinsic quality of information (0.73 / 0.48)
- Reliability of information (0.79 / 0.66)
- Contextual quality of information (0.85 / 0.75)
- Presentational quality of information (0.87 / 0.77)
- Accessibility of information (0.80 / 0.57)
- Flexibility of information (0.81 / 0.58)
- Usefulness of information (0.91 / 0.66)

Service performance (0.89 / 0.63)
- Responsiveness of services (0.88 / 0.79)
- Intrinsic quality of service provider (0.84 / 0.56)
- Interpersonal quality of service provider (0.93 / 0.82)
- IS training (0.59 / 0.33)
- Flexibility of services (0.69 / 0.37)

Because perceptual measures of organizational performance are considered to be acceptable in the literature [32, 107], seven items that assess the CIO's perception of IS's contribution to improving the organization's performance in those areas were used.

Business Processes Effectiveness

Aside from directly affecting organizational performance, the IS function should also have an effect on organizational performance through its impact on the effectiveness of business processes, as shown in Figure 1. IS have traditionally been implemented to improve the efficiency of internal operations. This use of IT has more recently been applied in redesigning both intra- and interfunctional business processes [94].

Improvements to the value-chain activities through IT are captured in this construct. Based on Porter and Millar [76] and Davenport [28], Xia [113] developed a 39-item instrument to assess executives' perceptions of the extent to which IT improved the effectiveness of six value-chain activities. Data analysis resulted in six factors: production operations, product development, supplier relations, marketing services, management processes, and customer relations. Items representing those six factors were generated for this construct.


Table 4. Chi-Square Differences Between Factors (***p < 0.001)

Systems performance:
            Factor 1   Factor 2   Factor 3   Factor 4   Factor 5
Factor 2    39.61***
Factor 3    31.33***   37.00***
Factor 4    26.31***   38.67***   28.55***
Factor 5    66.45***   78.42***   59.97***   65.97***
Factor 6    49.47***   64.66***   55.23***   58.68***   70.76***

Information effectiveness:
            Factor 1   Factor 2   Factor 3   Factor 4   Factor 5   Factor 6
Factor 2    63.61***
Factor 3    80.96***   73.22***
Factor 4    64.16***   48.21***   75.03***
Factor 5    65.65***   40.60***   65.60***   47.34***
Factor 6    62.68***   52.31***   78.87***   60.04***   48.66***
Factor 7    67.31***   47.84***   67.80***   57.82***   47.27***   47.36***

Service performance:
            Factor 1   Factor 2   Factor 3   Factor 4
Factor 2    23.84***
Factor 3    51.52***   52.93***
Factor 4    46.31***   48.52***   73.42***
Factor 5    30.69***   46.89***   75.73***   53.93***

Since a different sample was used, tests were conducted to ensure the reliability and validity of those constructs. Reliability was evaluated using Cronbach's alpha. Items with low corrected item-total correlations, indicating low internal consistency for the items, were dropped. Construct validity was assessed by exploratory factor analysis using principal components with oblique rotation as the extraction method. Table 5 presents the results of the analyses.
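A compact rendering of the two checks just described, assuming `items` is a DataFrame holding one construct's survey items as columns (both formulas are standard; this is not the authors' code):

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def corrected_item_total(items: pd.DataFrame) -> pd.Series:
    # Correlate each item with the sum of the *other* items, so an item
    # cannot inflate its own item-total correlation.
    return pd.Series({
        col: items[col].corr(items.drop(columns=col).sum(axis=1))
        for col in items.columns
    })
```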

Although both constructs had two factors extracted, the two factors were significantly correlated in both cases. Therefore, all items within each construct were retained and used to create an overall score for the construct. Items in the final measurement models were used to create an overall score for the ISFS construct. The average of all items for each construct was used to avoid problems that may occur due to differences in measurement scales. Table 6 shows the correlations among the constructs.

As shown in Table 6, there were significant positive correlations between the ISFS construct and the two consequence constructs. There was also a significant positive correlation between business processes effectiveness and organizational performance.
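The Table 6 entries are ordinary Pearson correlations between construct scores on the matched pairs. A minimal sketch, with hypothetical column names:

```python
# Nomological check: correlate the three construct scores. `scores` is a
# hypothetical DataFrame with one row per matched CIO/user pair.
import pandas as pd
from scipy.stats import pearsonr

def nomological_correlations(scores: pd.DataFrame):
    pairs = [("isfs", "bp_effectiveness"),
             ("isfs", "org_performance"),
             ("bp_effectiveness", "org_performance")]
    out = {}
    for a, b in pairs:
        valid = scores[[a, b]].dropna()      # keep complete matched pairs only
        r, p = pearsonr(valid[a], valid[b])
        out[(a, b)] = (round(r, 3), p)
    return out
```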


Table 5. Reliability and Validity of Nomological Constructs

Construct: Number of factors extracted / Number of items retained / Variance explained (percent) / Cronbach's alpha
Business processes effectiveness: 2 / 6 / 63.48 / 0.759
Organizational performance: 2 / 7 / 70.04 / 0.860

Table 6. Correlation of Constructs in the Nomological Network

                              Business processes    Organizational performance
Organizational performance    0.750**
ISFS                          0.214**               0.205**

** Correlation is significant at the 0.01 level (two-tailed).

Results, Limitations, and Managerial Uses

THE ISFS INSTRUMENT IS A COMPREHENSIVE ONE that has been designed to measure the performance of the entire IS function. The instrument consists of three major dimensions: systems performance, information effectiveness, and service performance. Each dimension contains several unidimensional subconstructs, each of which is measured by at least two items. All scales have high reliability. Evidence from discriminant validity analyses showed that each scale measures a construct that is different from the other constructs.

Of course, some limitations to the instrument need to be pointed out. The sample size, while large, especially for matched-pair survey studies, is borderline for the number of variables relative to the number of observations that are involved in the SEM analysis. Thus, some caution should be taken until it is revalidated. The nonresponse bias analysis was conducted by comparing early responders to late responders and in terms of organizational size-related variables for responders and nonresponders. Although this is common practice, other variables might have been analyzed [57]. We also note that two subconstructs, IS training and flexibility of services, were borderline with respect to reliability. Despite this, these items were retained for comprehensiveness or theoretical soundness. Further studies will need to explore and improve these items. The ISFS may therefore be thought of as a preliminary step that can guide future research and enhance practice in a significant, but limited, way. The ISFS integrates aspects of various philosophical approaches that have been


IS activities, as reflected in the hierarchical structure of 18 unidimensional subconstructs within the three dimensions. This allows IS managers to use this instrument to assess their strengths and weaknesses in those subconstructs and dimensions. Of course, since the dimensions are not independent, the fact that one dimension may be low does not tell the whole story. Thus, such an indication must be further assessed in light of the overall Gestalt of dimensions. In this sense, the ISFS also allows the IS function to pinpoint specific areas that need improvement and to track both these areas and overall performance over time, thus providing the basis for continuous improvement [73].

When used in large organizations with decentralized IS functions, the ISFS instrument can offer valuable insights for internal benchmarking. Comparing the results of ISFS instruments from different divisions or areas would help identify areas of IS excellence and facilitate the transfer of knowledge to other areas. It has already been used in each of these ways in a number of organizations that participated in the study.

In addition, with the intensifying scrutiny on IT investment, analysis, and outsourcing, the ISFS instrument can be very useful for establishing a baseline on the current status of ISFP. Comparing postinvestment or outsourcing IS performance to the baseline would provide a more objective evaluation of the efficacy of the actions taken. This, in turn, will allow the organization to develop follow-up actions to maximize IS performance and, ultimately, to improve organizational performance.

Thus, the instrument can be used in various ways: as an overall evaluative tool, as a Gestalt of areas that may be tracked over time, or in evaluating specific subareas. At this latter level, it also provides a means of identifying the specific performance areas, as represented by the subconstructs, that may need improvement.

Because of the use of data from a cross-sectional field survey for validation, the ISFS instrument is applicable to a variety of industries. When used within an organization, the instrument should be administered to a range of systems users, in terms of both functional areas and organizational levels. This would ensure appropriate representation of the diverse users in the organization. The average scores for each subconstruct or dimension are the indicators of the IS function's performance for the specific subarea or dimension. To be effective, the ISFS instrument should be administered repeatedly at a fixed interval, between quarterly and annually. The results of later assessments should be compared to earlier evaluations to detect changes that would indicate improvement or degradation in the IS function's performance and in specific performance areas. One additional caveat may be useful. The nomological validation was performed in terms of the overall ISFS score. As a result, there is no assurance that the subscores have the same degree of nomological validity. Since such an analysis is beyond the scope of this study, we leave it to others who may wish to concern themselves with this issue.
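For practitioners administering the instrument, the scoring just described reduces to item means with 0 ("not applicable") treated as missing. A minimal sketch, with a hypothetical subconstruct-to-item mapping standing in for the actual Appendix items:

```python
import pandas as pd

# Hypothetical mapping from subconstructs to item columns; the real
# mapping comes from the instrument in the Appendix.
SUBCONSTRUCTS = {
    "impact_on_job": ["sij1", "sij2", "sij3"],
    "ease_of_use": ["seu1", "seu2"],
}

def score_isfs(raw: pd.DataFrame) -> pd.DataFrame:
    responses = raw.mask(raw == 0)        # 0 = "not applicable" -> missing
    scores = pd.DataFrame(index=raw.index)
    for name, items in SUBCONSTRUCTS.items():
        scores[name] = responses[items].mean(axis=1)  # per-respondent mean, skips NA
    return scores

# Organization-level indicator for each subarea: average across users,
# then compare against the previous administration to detect change.
# profile_now = score_isfs(survey_now).mean()
# profile_prev = score_isfs(survey_prev).mean()
# change = profile_now - profile_prev
```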

Overall, the goal of developing a measure to assess the performance of the IS function was successfully achieved in this study. The resulting instrument is not only


useful for practitioners in evaluating the performance of the IS function and for researchers to use in studies that require ISFP as a dependent or independent construct, as well as in studies that seek to complement the ISFS through other analyses.

REFERENCES

1. Alavi, M.; Marakas, G.M.; and Yoo, Y. A comparative study of distributed learning environments on learning outcomes. Information Systems Research, 13, 4 (December 2002), 404-415.
2. Anderson, J.C. An approach for confirmatory measurement and structural equation modeling of organizational properties. Management Science, 33, 4 (April 1987), 525-541.
3. Anderson, J.C., and Gerbing, D.W. Structural equation modeling in practice: A review and recommended two-step approach. Psychological Bulletin, 103, 3 (May 1988), 411-423.
4. Armstrong, J.S., and Overton, T.S. Estimating nonresponse bias in mail surveys. Journal of Marketing Research, 14, 3 (August 1977), 396-402.
5. Bagozzi, R.P. An examination of the validity of two models of attitude. Multivariate Behavioral Research, 16, 3 (July 1981), 323-359.
6. Bailey, J.E., and Pearson, S.W. Development of a tool for measuring and analyzing computer user satisfaction. Management Science, 29, 5 (May 1983), 530-545.
7. Ballantine, J.; Bonner, M.; Levy, M.; Martin, A.; Monro, I.; and Powell, P.L. Developing a 3-D model of information systems success. In E.J. Garrity and G.L. Sanders (eds.), Information Systems Success Measurement. Hershey, PA: Idea Group, 1998, pp. 46-59.
8. Baroudi, J.J., and Orlikowski, W.J. A short-form measure of user information satisfaction: A psychometric evaluation and notes on use. Journal of Management Information Systems, 4, 4 (Spring 1988), 44-59.

9. Basu, A., and Kumar, A. Workflow management issues in e-business. Information Systems Research, 13, 1 (March 2002), 1-14.
10. Benaroch, M. Managing information technology investment risk: A real options perspective. Journal of Management Information Systems, 19, 2 (Fall 2002), 43-84.
11. Bergeron, F.; Rivard, S.; and De Serre, L. Investigating the support role of the information center. MIS Quarterly, 14, 3 (September 1990), 247-260.
12. Brancheau, J.C., and Wetherbe, J.C. Key issues in information systems management. MIS Quarterly, 11, 1 (March 1987), 23-45.
13. Brancheau, J.C.; Janz, B.D.; and Wetherbe, J.C. Key issues in information systems management: 1994-95 SIM Delphi results. MIS Quarterly, 20, 2 (June 1996), 225-242.
14. Broadbent, M., and Weill, P. Management by maxim: How business and IT managers can create IT infrastructures. Sloan Management Review, 38, 3 (Spring 1997), 77-92.
15. Broadbent, M.; Weill, P.; O'Brien, T.; and Neo, B.N. Firm context and patterns of IT infrastructure capability. In J.I. DeGross, S.L. Jarvenpaa, and A. Srinivasan (eds.), Proceedings of the Seventeenth International Conference on Information Systems. Atlanta: Association for Information Systems, 1996, pp. 174-194.
16. Brynjolfsson, E. The productivity paradox of information technology. Communications of the ACM, 36, 12 (December 1993), 67-77.
17. Byrd, T.A., and Turner, D.E. Measuring the flexibility of information technology infrastructure: Exploratory analysis of a construct. Journal of Management Information Systems, 17, 1 (Summer 2000), 167-208.
18. Cameron, K.S. A study of organizational effectiveness and its predictors. Management Science, 32, 1 (January 1986), 87-112.

19. Cameron, K.S., and Whetton, D.A. Some conclusions about organizational effectiveness. In K.S. Cameron and D.A. Whetton (eds.), Organizational Effectiveness: A Comparison


21. Chan, Y.E.; Huff, S.L.; Barclay, D.W.; and Copeland, D.G. Business strategic orientation, information systems strategic orientation, and strategic alignment. Information Systems Research, 8, 2 (June 1997), 125-150.
22. Chatterjee, D.; Grewal, R.; and Sambamurthy, V. Shaping up for e-commerce: Institutional enablers of the organizational assimilation of Web technologies. MIS Quarterly, 26, 2 (June 2002), 65-90.

23. Chatterjee, D.; Pacini, C.; and Sambamurthy, V. The shareholder-wealth and trading-volume effects of information technology infrastructure investments. Journal of Management Information Systems, 19, 2 (Fall 2002), 7-42.
24. Chircu, A.M., and Kauffman, R.J. Limits to value in electronic commerce-related IT investments. Journal of Management Information Systems, 17, 2 (Fall 2000), 59-80.
25. Churchill, G.A. A paradigm for developing better measures of marketing constructs. Journal of Marketing Research, 16, 1 (February 1979), 64-73.
26. Cronbach, L.J., and Meehl, P.E. Construct validity in psychological tests. In D.M. Jackson and S. Messick (eds.), Problems in Human Assessment. New York: McGraw-Hill, 1967, pp. 57-77.
27. Cronin, J.J.J., and Taylor, S.A. SERVPERF versus SERVQUAL: Reconciling performance-based and perceptions-minus-expectations measurement of service quality. Journal of Marketing, 58, 1 (January 1994), 125-131.
28. Davenport, T.H. Process Innovation: Reengineering Work Through Information Technology. Boston: Harvard Business School Press, 1993.
29. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13, 3 (September 1989), 319-340.
30. DeLone, W.H., and McLean, E.R. Information systems success: The quest for the dependent variable. Information Systems Research, 3, 1 (March 1992), 60-95.
31. DeLone, W.H., and McLean, E.R. The DeLone and McLean model of information systems success: A ten-year update. Journal of Management Information Systems, 19, 4 (Spring

2003), 9-30.
32. Dess, G.G., and Robinson, R.B.J. Measuring organizational performance in the absence of objective measures: The case of the privately-held firm and conglomerate business unit. Strategic Management Journal, 5, 3 (July-September 1984), 265-273.
33. Devaraj, S.; Fan, M.; and Kohli, R. Antecedents of B2C channel satisfaction and preference: Validating e-commerce metrics. Information Systems Research, 13, 3 (September 2002), 316-333.
34. Dickson, G.W.; Leitheiser, R.L.; Nechis, M.; and Wetherbe, J.C. Key information systems issues for the 1980s. MIS Quarterly, 8, 3 (September 1984), 135-148.
35. Doll, W.J., and Torkzadeh, G. The measurement of end-user computing satisfaction. MIS Quarterly, 12, 2 (June 1988), 259-274.
36. Duncan, N.B. Capturing flexibility of information technology infrastructure: A study of resource characteristics and their measure. Journal of Management Information Systems, 12, 2 (Fall 1995), 37-57.
37. Fan, M.; Stallaert, J.; and Whinston, A.B. Decentralized mechanism design for supply chain organizations using an auction market. Information Systems Research, 14, 1 (March 2003), 1-22.
38. Fitzgerald, L.; Johnston, R.; Brignall, S.; Silvestro, R.; and Voss, C. Performance Measurement in Service Businesses. London: Chartered Institute of Management Accountants, 1993.
39. Gefen, D., and Ridings, C.M. Implementation team responsiveness and user evaluation of customer relationship management: A quasi-experimental design study of social exchange theory. Journal of Management Information Systems, 19, 1 (Summer 2002), 47-70.
40. Gerbing, D.W., and Anderson, J.C. An updated paradigm for scale development incorporating unidimensionality and its assessment. Journal of Marketing Research, 25, 2 (May 1988), 186-192.
41. Glazer, R. Measuring the knower: Towards a theory of knowledge equity. California Management Review


43. Goodhue, D.L., and Thompson, R.L. Task-technology fit and individual performance. MIS Quarterly, 19, 2 (June 1995), 213-236.
44. Govindarajulu, C., and Reithel, B.J. Beyond the information center: An instrument to measure end-user computing support from multiple sources. Information & Management, 33, 5 (May 1998), 241-250.
45. Grover, V., and Davenport, T.H. General perspectives on knowledge management: Fostering a research agenda. Journal of Management Information Systems, 18, 1 (Summer 2001), 5-22.
46. Grover, V.; Jeong, S.R.; and Segars, A.H. Information systems effectiveness: The construct space and patterns of application. Information & Management, 31, 4 (December 1996), 177-191.
47. Guimaraes, T., and Igbaria, M. Exploring the relationship between IC success and company performance. Information & Management, 26, 3 (March 1994), 133-141.
48. Hair, J.F.J.; Anderson, R.E.; Tatham, R.L.; and Black, W.C. Multivariate Data Analysis, 5th ed. Upper Saddle River, NJ: Prentice Hall, 1998.
49. Hartog, C., and Herbert, M. 1985 opinion survey of MIS managers: Key issues. MIS Quarterly, 10, 4 (December 1986), 351-361.
50. Hattie, J. Methodology review: Assessing unidimensionality of tests and items. Applied Psychological Measurement, 9, 2 (June 1985), 139-164.
51. Hitt, L.M.; Wu, D.J.; and Zhou, X. Investment in enterprise resource planning: Business impact and productivity measures. Journal of Management Information Systems, 19, 1 (Summer 2002), 71-98.
52. Jiang, J.J.; Klein, G.; and Carr, C.L. Measuring information system service quality: SERVQUAL from the other side. MIS Quarterly, 26, 2 (June 2002), 145-166.
53. Joreskog, K.G. Testing structural equation models. In K.A. Bollen and J.S. Long (eds.), Testing Structural Equation Models. Newbury Park, CA: Sage, 1993, pp. 294-316.
54. Kerlinger, F.N. Foundations of Behavioral Research. New York: McGraw-Hill, 1978.
55. Kim, J.; Lee, J.; Han, K.; and Lee, M. Businesses as buildings: Metrics for the architectural quality of Internet businesses. Information Systems Research, 13, 3 (September 2002), 239-254.
56. King, W.R. Management information systems. In H. Bidgoli (ed.), Encyclopedia of Information Systems, vol. 3. New York: Academic Press, 2003.
57. King, W.R., and He, J. External validity, coverage and nonresponse errors in IS survey research. Working paper, Katz Graduate School of Business, University of Pittsburgh, 2004.
58. Kohli, R., and Devaraj, S. Measuring information technology payoff: A meta-analysis of structural variables in firm-level empirical research. Information Systems Research, 14, 2 (June 2003), 127-145.
59. Kraemer, K.L.; Danziger, J.N.; Dunkle, D.E.; and King, J.L. The usefulness of computer-based information to public managers. MIS Quarterly, 17, 2 (June 1993), 129-148.
60. Kraut, R.; Dumais, S.; and Koch, S. Computerization, productivity, and quality of work-life. Communications of the ACM, 32, 2 (February 1989), 220-238.
61. Lacity, M., and Willcocks, L. Global Information Technology Outsourcing. Chichester, UK: Wiley, 2001.
62. Lambert, D.M., and Harrington, T.C. Measuring nonresponse bias in customer service mail surveys. Journal of Business Logistics, 11, 2 (1990), 5-25.
63. Larsen, K.R.T. A taxonomy of antecedents of information systems success: Variable analysis studies. Journal of Management Information Systems, 20, 2 (Fall 2003), 169-246.
64. Lee, H., and Choi, B. Knowledge management enablers, processes, and organizational performance: An integrative view and empirical examination. Journal of Management Information Systems, 20, 1 (Summer 2003), 179-228.
65. Lee, H.; Kwak, W.; and Han, I. Developing a business performance evaluation system: An analytical hierarchical model. Engineering Economist, 40, 4 (Summer 1995), 343-357.
66. Magal, S.R.; Carr, H.H.; and Watson, H.J. Critical success factors for information center


68. Moore, G.C., and Benbasat, I. Development of an instrument to measure the perceptions of adopting an information technology innovation. Information Systems Research, 2, 3 (September 1991), 192-222.
69. Myers, B.L.; Kappelman, L.A.; and Prybutok, V.R. A comprehensive model for assessing the quality and productivity of the information systems function: Toward a theory for information systems assessment. In E.J. Garrity and G.L. Sanders (eds.), Information Systems Success Measurement. Hershey, PA: Idea Group, 1998, pp. 94-121.
70. Nelson, K.M., and Cooprider, J.G. The contribution of shared knowledge to IS group performance. MIS Quarterly, 20, 4 (December 1996), 409-432.
71. Nelson, R.R., and Cheney, P.H. Training end users: An exploratory study. MIS Quarterly, 11, 4 (December 1987), 547-559.
72. Niederman, F.; Brancheau, J.C.; and Wetherbe, J.C. Information systems management issues for the 1990s. MIS Quarterly, 15, 4 (December 1991), 475-500.
73. Olian, J.D., and Rynes, S.L. Making total quality work: Aligning organizational processes, performance measures, and stakeholders. Human Resource Management, 30, 3 (Fall 1991), 303-333.
74. Parasuraman, A.; Zeithaml, V.A.; and Berry, L.L. Refinement and reassessment of the SERVQUAL scale. Journal of Retailing, 64, 4 (Winter 1991), 420-450.
75. Pitt, L.F.; Watson, R.T.; and Kavan, C.B. Service quality: A measure of information systems effectiveness. MIS Quarterly, 19, 2 (June 1995), 173-185.
76. Porter, M.E., and Millar, V.E. How information gives you competitive advantage. Harvard Business Review, 63, 4 (July-August 1985), 149-160.
77. Premkumar, G. Evaluation of Strategic Information Systems Planning: Empirical Validation of a Conceptual Model. Pittsburgh: University of Pittsburgh, 1989.
78. Raghunathan, B.; Raghunathan, T.S.; and Tu, Q. Dimensionality of the strategic grid framework: The construct and its measurement. Information Systems Research, 10, 4 (December 1999), 343-355.
79. Rai, A.; Lang, S.S.; and Welker, R.B. Assessing the validity of IS success models: An empirical test and theoretical analysis. Information Systems Research, 13, 1 (March 2002), 50-69.

    empirical test and theoretical analysis.InformationSystems Research, 13 , March 2002), 50-69 .80 .Ryan, S.D., and Harrison, D.A. Considering social subsystem costs and benefit in infor-mation technology investment decisions: A view from the field on anticipated payoffs.Journalof Management Information Systems, 16, 4 Spring 2000) , 11 ^0 .81.Ryker, R., and Nath, R. An empirical examination of the impact of com puter informationsystems on users.Information Managem ent, 29, 4 1995), 207-21 4.82 . Saarinen, T. An expanded instrument for evaluating information system success.Infor-mation Managem ent, 31, 2 1996), 103-11 8.83 . Santhanam, R., and Hartono, E. Issues in linking information technology capability tofirm performance. M ISQuarterly, 27 , March 2003), 125-154.84 .Saunders, C.S., and Jones, J.W. M easuring performance of the information systems func-tion.Journal of Management Information Systems, 8, 4 Spring 1992), 63 -82 .85 . Schwab, D.P. Construct validation in organizational behavior. In B.M. Staw and L.L.Cummings eds.).Research in Organizational Behavior, vol. 2. Greenwich, CT: JAl Press,1980,pp. 3-43.86 .Scott, J.E. Facilitating interorganizational learning with information technology.Journalof Management Information Systems, 17, 2 Fall 2000), 81-1 14 .87. Seddon, P.B. A respecification and extension of the DeLon e and M cLean model of ISsuccess.Information Systems Research, 8, 3 September 1997), 240 -253.88 . Seddon, P.B.; Staples, S.; Patnayauni, R.; and Bowtell, M. Dimensions of informationsystems success.Com munications of the AIS, 2 November 1999), 2-3 9.89 .Segars, A.H. Assessing the unidimensionality of measurement: A paradigm and illustra-tion within the context of information systems research.Omega,25 , February 1997),107-121.90 . Segars, A.H., and Grover, V. Re-examining perceived ease of use and usefulness: Aconfirmatory factor analysis.M ISQu arterly,17, 4 December 1993), 51 7-5 25.91 .S egars, A.H ., and Grover,V.Strategic information systems planning su ccess: An investi-


93. Sethi, V., and King, W.R. Development of measures to assess the extent to which an information technology application provides competitive advantage. Management Science, 40, 12 (December 1994), 1601-1627.
94. Sethi, V., and King, W.R. Organizational Transformation Through Business Process Re-Engineering. Upper Saddle River, NJ: Prentice Hall, 1998.
95. Steers, R.M. Problems in the measurement of organizational effectiveness. Administrative Science Quarterly, 20, 4 (December 1975), 546-558.
96. Straub, D.W.; Hoffman, D.L.; Weber, B.W.; and Steinfield, C. Measuring e-commerce in Net-enabled organizations: An introduction to the special issue. Information Systems Research, 13, 2 (June 2002), 115-124.
97. Sussman, S.W., and Siegal, W.S. Informational influence in organizations: An integrated approach to knowledge adoption. Information Systems Research, 14, 1 (March 2003), 47-65.
98. Tallon, P.P.; Kraemer, K.L.; and Gurbaxani, V. Executives' perceptions of the business value of information technology: A process-oriented approach. Journal of Management Information Systems, 16, 4 (Spring 2000), 145-173.
99. Tam, K.Y. The impact of information technology investment on firm performance and evaluation: Evidence from newly industrialized economies. Information Systems Research, 9, 1 (March 1998), 85-98.
100. Templeton, G.F.; Lewis, B.R.; and Snyder, C.A. Development of a measure for the organizational learning construct. Journal of Management Information Systems, 19, 2 (Fall 2002), 175-218.
101. Teo, T.S.H. Integration between business planning and information systems planning: Evolutionary-contingency perspectives. Ph.D. dissertation, University of Pittsburgh, 1994.
102. Torkzadeh, G., and Dhillon, G. Measuring factors that influence the success of Internet commerce. Information Systems Research, 13, 2 (June 2002), 187-204.
103. Torkzadeh, G., and Doll, W.J. The development of a tool for measuring the perceived impact of information technology on work. Omega: International Journal of Management Science, 27, 3 (June 1999), 327-339.
104. Van Dyke, T.P.; Kappelman, L.A.; and Prybutok, V.R. Measuring information systems service quality: Concerns on the use of the SERVQUAL questionnaire. MIS Quarterly, 21, 2 (June 1997), 195-208.
105. Venkatesh, V.; Morris, M.G.; Davis, G.B.; and Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Quarterly, 27, 3 (September 2003), 425-478.
106. Venkatraman, N. Strategic orientation of business enterprises: The construct, dimensionality, and measurement. Management Science, 35, 8 (August 1989), 942-962.
107. Venkatraman, N., and Ramanujam, V. Measurement of business economic performance: An examination of method convergence. Journal of Management, 13, 1 (1987), 109-122.
108. Wand, Y., and Wang, R.Y. Anchoring data quality dimensions in ontological foundations. Communications of the ACM, 39, 11 (November 1996), 86-95.
109. Wang, R.Y., and Strong, D.M. Beyond accuracy: What data quality means to data consumers. Journal of Management Information Systems, 12, 4 (Spring 1996), 5-34.
110. Watson, R.T.; Pitt, L.F.; and Kavan, C.B. Measuring information systems service quality: Lessons from two longitudinal case studies. MIS Quarterly, 22, 1 (March 1998), 61-79.
111. Weill, P. The relationship between investment in information technology and firm performance: A study of the valve manufacturing sector. Information Systems Research, 3, 4 (December 1992), 307-333.
112. Wells, C.E. Evaluation of the MIS function in an organization: An exploration of the measurement problem. Ph.D. dissertation, University of Minnesota, Minneapolis, 1987.
113. Xia, W. Dynamic capabilities and organizational impact of IT infrastructure: A research framework and empirical investigation. Ph.D. dissertation, University of Pittsburgh, 1998.


Appendix. ISFS Instrument

What Is the IS Function?

THIS QUESTIONNAIRE IS DESIGNED TO ASSESS the performance of the information systems (IS) function in your organization. The IS function includes all IS individuals, groups, and departments within the organization with whom you interact regularly. As a user of some information systems/technology, you have your own definition of what the IS function means to you, and it is the performance of your IS function that should be addressed here.
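For readers who want to administer the questionnaire electronically, one natural representation is a mapping from each ISFS dimension to its item stems. The sketch below is an editorial illustration, not part of the original instrument; the dimension names follow this appendix's section headings, and only a few items per dimension are shown.

```python
# Illustrative structure for the ISFS instrument (an editorial sketch, not
# code from the paper). Dimension names follow the appendix's section
# headings; item lists are abbreviated here.
ISFS_ITEMS: dict[str, list[str]] = {
    "Effectiveness of Information": [
        "The information is interpretable",
        "The information is understandable",
        # ... remaining information items
    ],
    "IS Service Performance": [
        "The IS function responds to your service requests in a timely manner",
        # ... remaining service items
    ],
    "Systems Performance": [
        "Systems make it easier to do your job",
        # ... remaining systems items
    ],
}
```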

Effectiveness of Information

The following statements ask you to assess the general characteristics of the information that IS provides to you. Please try to focus on the data and information itself in giving the response that best represents your evaluation of each statement. If a statement is not applicable to you, circle 0.

The extent that the information is: (Scale: 1 = Hardly at all, 5 = To a great extent; 0 = N/A)
Interpretable
Understandable
Complete
Clear
Concise
Accurate
Secure
Important
Relevant
Usable
Well organized
Well defined
Available
Accessible
Up-to-date
Received in a timely manner
Reliable
Verifiable
Believable


The extent that the information: (Scale: 1 = Hardly at all, 5 = To a great extent; 0 = N/A)
Can be easily compared to past information.
Can be easily maintained.
Can be easily changed.
Can be easily integrated.
Can be easily updated.
Can be used for multiple purposes.
Meets all your requirements.

The following statements ask you to assess the outcome of using the information that IS provides to you.

The extent that: (Scale: 1 = Hardly at all, 5 = To a great extent; 0 = N/A)
The amount of information is adequate.
It is easy to identify errors in information.
It helps you discover new opportunities to serve customers.
It is useful for defining problems.
It is useful for making decisions.
It improves your efficiency.
It improves your effectiveness.
It gives your company a competitive edge.
It is useful for identifying problems.


IS Service Performance

The following statements ask you to assess the performance of services provided by the IS department or function. Please circle the number that best represents your evaluation of each statement. If a statement is not applicable to you, circle the number 0.

The extent that the: (Scale: 1 = Hardly at all, 5 = To a great extent; 0 = N/A)
Training programs offered by the IS function are useful.
Variety of training programs offered by the IS function is sufficient.
IS function's services are cost-effective.
Training programs offered by the IS function are cost-effective.
IS function's services are valuable.
IS function's services are helpful.

The extent that the IS function: (Scale: 1 = Hardly at all, 5 = To a great extent; 0 = N/A)
Responds to your service requests in a timely manner.
Completes its services in a timely manner.
Is dependable in providing services.
Has your best interest at heart.
Gives you individual attention.
Has sufficient capacity to serve all its users.
Can provide emergency services.
Provides a sufficient variety of services.
Has sufficient people to provide services.
Extends its systems/services to your customers/suppliers.


The extent that IS people: (Scale: 1 = Hardly at all, 5 = To a great extent; 0 = N/A)
Provide services for you promptly.
Are dependable.
Are efficient in performing their services.
Are effective in performing their services.
Have the knowledge and skill to do their job well.
Are reliable.
Are polite.
Are sincere.
Show respect to you.
Are pleasant to work with.
Instill confidence in you.
Are helpful to you.
Solve your problems as if they were their own.
Understand your specific needs.
Are willing to help you.
Help to make you a more knowledgeable computer user.

Systems Performance

The following statements ask you to assess the extent that systems produce various outcomes for you. The term "systems" does not refer to the information itself. Rather, it refers to the capability to access, produce, manipulate, and present information to you (e.g., to access databases, or to develop a spreadsheet). Please circle the number that best represents your evaluation of each statement. If a statement is not applicable to you, circle 0.

The extent that systems: (Scale: 1 = Hardly at all, 5 = To a great extent; 0 = N/A)
Make it easier to do your job.
Improve your job performance.
Give you confidence to accomplish your job.
Increase your productivity.
Increase your participation in decisions.
Increase your awareness of job-related information.
Improve the quality of your work product.
Enhance your problem-solving ability.
Help you manage relationships with external business partners.
Improve customer satisfaction.
Improve customer service.
Enhance information sharing with your customers/suppliers.
Help retain valued customers.
Help you select and qualify desired suppliers.
Speed product delivery.
Help you manage inbound logistics.
Improve management control.
Streamline work processes.
Reduce process costs.
Reduce cycle times.
Provide you information from other areas in the organization.
Facilitate collaborative problem solving.
Facilitate collective group decision making.
Facilitate your learning.
Facilitate collective group learning.
Facilitate knowledge transfer.
Contribute to innovation.
Facilitate knowledge utilization.


The following statements ask you to assess general characteristics of the information systems that you use regularly. Please circle the number that best represents your evaluation of each statement. If a statement is not applicable to you, circle 0.

The extent that: (Scale: 1 = Hardly at all, 5 = To a great extent; 0 = N/A)
Systems have fast response time.
System downtime is minimal.
Systems are well integrated.
Systems are reliable.
Systems are accessible.
Systems meet your expectations.
Systems are cost-effective.
Systems are responsive to meet your changing needs.
Systems are flexible.
Systems are easy to use.
System use is easy to learn.
Your company's intranet is easy to navigate.
It is easy to become skillful in using systems.
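For researchers who wish to score completed questionnaires, a minimal sketch follows. It assumes each item response is captured as an integer from 0 to 5, with 0 ("N/A") treated as missing rather than as a low rating. The unweighted mean per dimension is purely illustrative; the study derives 18 unidimensional factors from these items via factor analysis rather than by simple averaging, and the function and variable names below are hypothetical.

```python
# Minimal, illustrative scoring sketch for ISFS responses (not the authors'
# validated scoring model). Each item is assumed coded 0-5, where 1-5 spans
# "Hardly at all" to "To a great extent" and 0 means "N/A" (treated as missing).
from statistics import mean
from typing import Optional

def dimension_score(responses: list[int]) -> Optional[float]:
    """Average the valid (1-5) responses for one ISFS dimension."""
    valid = [r for r in responses if 1 <= r <= 5]
    return mean(valid) if valid else None  # None if every item was N/A

# Example: hypothetical ratings for a handful of "Systems Performance" items.
systems_performance = [4, 5, 3, 0, 4, 2]  # the 0 (N/A) is excluded
print(dimension_score(systems_performance))  # -> 3.6
```

Treating 0 as missing rather than as a minimum score matters: including N/A responses in the average would systematically deflate dimension scores for respondents who simply do not use a given service or system.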
