
This article was downloaded by: [130.203.152.80] On: 19 February 2014, At: 06:34
Publisher: Institute for Operations Research and the Management Sciences (INFORMS)
INFORMS is located in Maryland, USA

Information Systems Research

Publication details, including instructions for authors and subscription information: http://pubsonline.informs.org

Research Note—Effects of Individual Self-Protection, Industry Self-Regulation, and Government Regulation on Privacy Concerns: A Study of Location-Based Services
Heng Xu, Hock-Hai Teo, Bernard C. Y. Tan, Ritu Agarwal

To cite this article:
Heng Xu, Hock-Hai Teo, Bernard C. Y. Tan, Ritu Agarwal (2012) Research Note—Effects of Individual Self-Protection, Industry Self-Regulation, and Government Regulation on Privacy Concerns: A Study of Location-Based Services. Information Systems Research 23(4):1342-1363. http://dx.doi.org/10.1287/isre.1120.0416

Full terms and conditions of use: http://pubsonline.informs.org/page/terms-and-conditions

This article may be used only for the purposes of research, teaching, and/or private study. Commercial use or systematic downloading (by robots or other automatic processes) is prohibited without explicit Publisher approval. For more information, contact [email protected].

The Publisher does not warrant or guarantee the article’s accuracy, completeness, merchantability, fitness for a particular purpose, or non-infringement. Descriptions of, or references to, products or publications, or inclusion of an advertisement in this article, neither constitutes nor implies a guarantee, endorsement, or support of claims made of that product, publication, or service.

Copyright © 2012, INFORMS

Please scroll down for article—it is on subsequent pages

INFORMS is the largest professional society in the world for professionals in the fields of operations research, management science, and analytics. For more information on INFORMS, its publications, membership, or meetings visit http://www.informs.org


Information Systems Research
Vol. 23, No. 4, December 2012, pp. 1342–1363
ISSN 1047-7047 (print) | ISSN 1526-5536 (online) http://dx.doi.org/10.1287/isre.1120.0416

© 2012 INFORMS

Research Note

Effects of Individual Self-Protection, Industry Self-Regulation, and Government Regulation on Privacy Concerns: A Study of Location-Based Services

Heng Xu
College of Information Sciences and Technology, The Pennsylvania State University, University Park, Pennsylvania 16802, [email protected]

Hock-Hai Teo
Department of Information Systems, School of Computing, National University of Singapore, Singapore 117418, [email protected]

Bernard C. Y. Tan
Department of Information Systems, School of Computing, National University of Singapore, Singapore 117418, [email protected]

Ritu Agarwal
Robert H. Smith School of Business, University of Maryland, College Park, Maryland 20742, [email protected]

This study seeks to clarify the nature of control in the context of information privacy to generate insights into the effects of different privacy assurance approaches on context-specific concerns for information privacy. We theorize that such effects are exhibited through mediation by perceived control over personal information and develop arguments in support of the interaction effects involving different privacy assurance approaches (individual self-protection, industry self-regulation, and government legislation). We test the research model in the context of location-based services using data obtained from 178 individuals in Singapore. In general, the results support our core assertion that perceived control over personal information is a key factor affecting context-specific concerns for information privacy. In addition to enhancing our theoretical understanding of the link between control and privacy concerns, these findings have important implications for service providers and consumers as well as for regulatory bodies and technology developers.

Key words: privacy; context-specific concerns for information privacy; psychological control; control agency; individual self-protection; industry self-regulation; and government regulation

History: Elena Karahanna, Senior Editor; Mariam Zahedi, Associate Editor. This paper was received on August 20, 2008, and was with the authors 24 months for 4 revisions. Published online in Articles in Advance April 18, 2012.

1. Introduction

Information privacy is an increasingly critical concern for many individuals around the world. The global diffusion of mobile technologies and the unbounded options for gathering, storing, processing, disseminating, and exploiting personal information trigger consumer concerns (FTC 2010). Over the past decade, scholars in information systems have examined the topic of privacy concerns (e.g., Angst and Agarwal 2009, Bansal et al. 2010, Dinev and Hart 2006, Malhotra et al. 2004, Son and Kim 2008) related to the collection and use of personal information from a variety of different perspectives. A general conclusion from this stream of research is that individuals would resist online transactions or the adoption of new technologies in the presence of significant privacy concerns.

It has been posited that individuals tend to have lower privacy concerns if they perceive a certain degree of control over the collection and use of their personal information (Dinev and Hart 2004, Xu et al. 2008). Considering the potential importance of control in the context of information privacy, this study explores and empirically demonstrates the contribution of psychological control theories to understanding information privacy. We conducted an experimental study in the specific context of location-based services (LBS). Three privacy assurance approaches (individual self-protection, industry self-regulation, and government legislation) were manipulated in the experiment, and their effects on control perceptions and privacy concerns were examined.

The current study contributes to existing privacy research in several important ways. First, prior research has shown a lack of clarity in explicating the link between control and privacy: some studies have defined privacy as control per se (e.g., Goodwin 1991, Milne and Rohm 2000); others have positioned control as a key factor shaping privacy (e.g., Laufer and Wolfe 1977, Dinev and Hart 2004). There have been very few attempts to bring control theories into privacy research to clarify the nature of control. The current research contributes to this controversial issue by explicating this control–privacy contention. Second, following the call by Margulis (2003a, b), the current research explicitly integrates the literature on psychological control into theories of privacy. Toward this end, we have adopted control agency theory to make a theoretical distinction among different privacy assurance approaches by explicitly linking them with different types of control agencies. Finally, most existing privacy research focuses on examining the individual effect of privacy assurance approaches (e.g., Culnan and Bies 2003, Metzger 2006). Building on the control agency framework, this research extends the literature by proposing interaction effects of these privacy assurance approaches on alleviating privacy concerns, based on the type of control agencies they can provide.

In what follows, we first provide an overview of prior relevant literature to establish a theoretical foundation for studying privacy, privacy concerns, control, and privacy assurance approaches. Then we develop the logic underlying the research hypotheses that relate different privacy assurance approaches to privacy concerns, mediated by perceived control. This is followed by a description of the research methodology and findings. The paper concludes with a discussion of the key results, directions for future research, and the practical implications of the findings.

2. Theoretical Development

2.1. The Concept of Privacy and the Construct of Privacy Concerns

Although various conceptions of privacy have been given in the literature, “the notion of privacy is fraught with multiple meanings, interpretations, and value judgments” (Waldo et al. 2007, p. x). The rich body of conceptual work has welcomed research to synthesize various conceptions and identify common ground. For example, Solove (2007) describes privacy as “a shorthand umbrella term” (p. 760) for a set of privacy problems resulting from information collection, information processing, information dissemination, and invasion activities. Culnan and Williams (2009) argue that these activities on information reuse and unauthorized access by organizations “can potentially threaten an individual’s ability to maintain a condition of limited access to his/her personal information” (p. 675). Solove’s groundwork (2007) for a pluralistic conception of privacy differentiates the concept of privacy (as an individual state) from the management of privacy (arising from organizational information processing activities). From the perspectives of individuals (Dhillon and Moores 2001, Schoeman 1984), we define privacy as a state of limited access to personal information.

Another challenge faced in understanding privacy is the diversity of measurements used by prior research. Measures that have been adopted for the examination of privacy include attitudes toward privacy (e.g., Buchanan et al. 2007), concerns for privacy (e.g., Smith et al. 1996), and privacy-related behavioral intentions (e.g., Dinev and Hart 2006). Based on an extensive review of the privacy literature, Smith et al. (2011) note that the variable of “privacy concerns” has become the central construct within IS research and the proxy used to operationalize the concept of “privacy.” Specifically, the Smith et al. (1996) Concern for Information Privacy (CFIP) scale has been considered one of the most reliable scales for measuring privacy concerns, using four data-related dimensions: collection, errors, unauthorized secondary use, and improper access to information. As Bansal et al. (2008) note, these four dimensions of CFIP support the underlying definitions of the U.S. Federal Trade Commission’s (FTC) fair information practices (FTC 2000), which include the stipulations that consumers be given notice that their personal information is being collected (mapped as collection of CFIP), consent with regard to the authorized use of their information (mapped as unauthorized secondary use of CFIP), access to personal data records to assure data accuracy (mapped as errors of CFIP), and security to prevent these data records from unauthorized access (mapped as improper access of CFIP).

More recently, Malhotra et al. (2004) developed a multidimensional scale of Internet Users’ Information Privacy Concerns (IUIPC), which adapted the CFIP instrument to the Internet context. Based on social contract and justice theories, IUIPC identified three dimensions of Internet privacy concerns: collection of personal information (rooted in distributive justice theory), control over personal information (rooted in procedural justice theory), and awareness of organizational privacy practices (rooted in interactional and information justice theory). In this research, we ground our work in the CFIP instead of the IUIPC. This is because, compared to IUIPC’s focus on an individual’s subjective view of fairness, CFIP’s focus on organizational privacy practices naturally maps onto the FTC’s fair information practices, which serve as the privacy standard used in the United States to address consumer privacy concerns.

2.2. General vs. Context-Specific Concerns for Information Privacy

In the IS field, most positivist studies on privacy have operationalized the construct of privacy concerns from two broad perspectives: some have treated it as a general concern over possible loss of information privacy across contexts (e.g., Awad and Krishnan 2006, Bellman et al. 2004, Dinev and Hart 2006, Smith et al. 1996); others have approached it as a context-specific concern for information privacy regarding particular websites or technologies (e.g., Bansal et al. 2008, Junglas et al. 2008, Oded and Sunil 2009, Pavlou et al. 2007). Emphasizing the role of contextual factors in shaping privacy beliefs, scholars in computer science (Ackerman and Mainwaring 2005), law (Solove 2006), and sociology (Margulis 2003a) have noted that it is important to draw a theoretical distinction between general concerns for privacy and context-specific concerns. For example, Ackerman and Mainwaring (2005) point out that people’s expectations and problems concerning privacy may all differ as they move among areas of computation and tasks. What constitutes a privacy concern on healthcare websites may be a very different problem for users than on social networking websites.

Therefore, we argue that these two types of privacy concerns (general versus context-specific) have distinct characteristics. Individuals’ general concerns for information privacy reflect their inherent needs and attitudes toward maintaining privacy, which are conceived to be more stable across domains or contexts. Context-specific concerns for information privacy, on the other hand, tie individuals’ assessments of privacy concerns to a specific context with a specific external agent, requiring consumers to engage in a dynamic assessment process in which their privacy needs are weighed against their information disclosure needs (Sheehan 2002). In terms of the relationship between these two types of privacy concerns, Li et al. (2011) have suggested that the effect of general privacy concerns may be overridden by context-specific privacy concerns because of the impact of contextual factors associated with a specific website and its information collection activities. In the IS field, Malhotra et al. (2004, p. 349) have made an explicit call for research examining privacy concerns “at a specific level”:

…privacy research in the IS domain has paid little attention to consumers’ perceptions specific to a particular context…. However, our findings clearly reveal that to have a complete understanding of consumer reactions to information privacy-related issues, researchers should examine not only consumers’ privacy concerns at a general level, but also consider salient beliefs and contextual differences at a specific level.

Following this call for examining privacy concerns at a specific level, we focus on context-specific as opposed to general concerns for information privacy in this research. Context has been defined as “stimuli and phenomena that surround and, thus, exist in the environment external to the individual, most often at a different level of analysis” (Mowday and Sutton 1993, p. 198, as cited in Smith et al. 2011). We consider contexts to incorporate different tasks or activities that consumers may undertake with different types of companies or websites (e.g., specific vendors, online pharmacies, and social networking sites) as well as different types of information collected from consumers (e.g., behavioral, location, financial, health, and biographical). We argue that privacy concerns are context specific, grounded in the specifics of by whom, why, when, and what type of personal information is being collected, distributed, and used. Concerns for information privacy in any specific context can be circumscribed by the elements within these specifics. In this research, we adapted the measures of CFIP from a general level to a context-specific level and define context-specific CFIP as “consumers’ concerns about possible loss of privacy as a result of information disclosure to a specific external agent” (in our case, a service provider) (Xu et al. 2011, p. 800).

2.3. Psychological Perspective of Control

Although the element of control is embedded in most privacy definitions (e.g., Altman 1976, Goodwin 1991, Johnson 1974) and has been used to operationalize privacy in measurement instruments (e.g., Malhotra et al. 2004), its meaning has been interpreted differently (see Table 1). For example, control has been used to refer to various targets such as the choice to opt out of an information exchange (Milne and Rohm 2000), the ability to affect the dissemination and use of personal information (Phelps et al. 2000), and secondary (indirect) control over the attainment of privacy-related outcomes (Johnson 1974). As Margulis (2003b) pointed out, the very nature of control is ambiguous in the context of information privacy. Similarly, Solove (2002) noted that “theorists provide little elaboration as to what control really entails, and it is often understood too narrowly or too broadly” (p. 1112). Margulis (2003a) also pointed out that privacy theorists had failed to adequately define the nature of control in their theories of privacy. It seems that the frequent link between control and privacy has not contributed as much to clarifying privacy issues as it should have. In response to the call for a stronger theoretical basis for research on information privacy, this study explores and empirically demonstrates the contribution of the psychological control perspective to the understanding of privacy concerns.

Table 1 Frequent Linkage of Privacy and Control

Altman (1976)
  Conceptualization of privacy: Privacy is conceptualized as selective control of access to the self or to one’s group.
  Role of control: Privacy is defined as interpersonal control.
  Meaning of control: Control is conceptualized as an active and dynamic regulatory process.
  Nature of control: Behavioral

Caudill and Murphy (2000)
  Conceptualization of privacy: Privacy is defined as consumers’ control of their information in a marketing interaction and the degree of their knowledge of the collection and use of their personal information.
  Role of control: Control is one dimension defining privacy.
  Meaning of control: Control refers to the ability to decide the amount and depth of information collected (i.e., through opt-in and opt-out options).
  Nature of control: Behavioral

Culnan (1993)
  Conceptualization of privacy: Privacy is defined as the ability of an individual to control the access others have to personal information.
  Role of control: Control is a core dimension underlying attitudes toward privacy.
  Meaning of control: Control refers to the ability of individuals to decide how their information is reused. Loss of control is operationalized as a dimension of privacy concerns.
  Nature of control: Behavioral

Dinev and Hart (2004)
  Conceptualization of privacy: Privacy is defined as the right to disclose information about oneself.
  Role of control: Perceived ability to control is an antecedent to privacy concern.
  Meaning of control: Control refers to the mechanism a website provides for consumers to control their submitted personal information.
  Nature of control: Behavioral

Goodwin (1991)
  Conceptualization of privacy: Privacy is defined based on two dimensions of control: control of information disclosure and control over unwanted intrusions into the environment.
  Role of control: Privacy is defined as control.
  Meaning of control: Control refers to the ability to decide (a) the presence of other people in the environment and (b) dissemination of information related to or provided during a transaction.
  Nature of control: Behavioral

Hoadley et al. (2010)
  Conceptualization of privacy: Privacy is conceptualized as individuals’ perceived control of their information in an online social interaction and the ease of information access by others.
  Role of control: Illusory loss of control can lead to privacy concern.
  Meaning of control: Control is conceptualized as a psychological perception (instead of actual controllability), and illusory loss of control, prompted by the interface change, can trigger users’ privacy concerns.
  Nature of control: Psychological

Johnson (1974)
  Conceptualization of privacy: Privacy is defined as secondary control in the service of need-satisfying outcome effectance.
  Role of control: Privacy is defined as behavioral selection control.
  Meaning of control: Control is conceptualized with two dimensions: primary (direct) and secondary (indirect) personal control over the attainment of privacy-related outcomes.
  Nature of control: Behavioral

Laufer and Wolfe (1977)
  Conceptualization of privacy: Privacy is tied to concrete situations with three dimensions: self-ego, environmental, and interpersonal.
  Role of control: Control is a mediating variable in the privacy system.
  Meaning of control: Control refers to the ability to choose how, under what circumstances, and to what degree the individual is to disclose information.
  Nature of control: Behavioral

Malhotra et al. (2004)
  Conceptualization of privacy: Privacy is defined as the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.
  Role of control: Control is viewed as a dimension of privacy concern.
  Meaning of control: Control refers to individuals’ ability to decide how their information is collected, used, and shared.
  Nature of control: Behavioral

Milne and Rohm (2000)
  Conceptualization of privacy: Privacy is defined as a state, on the basis of who controls consumer data and whether consumers are informed of information collection and privacy rights.
  Role of control: Control is one dimension defining privacy.
  Meaning of control: Control refers to the ability to remove names from marketing lists (i.e., through an opt-out mechanism).
  Nature of control: Behavioral

Phelps et al. (2000)
  Conceptualization of privacy: Privacy refers to the ability to affect the dissemination and use of personal information and control over unwanted use.
  Role of control: Control is one dimension defining privacy and is an antecedent to concern for firms’ information practices.
  Meaning of control: Control refers to the ability to influence how personal information is used and who will have access to it.
  Nature of control: Behavioral

Zweig and Webster (2002)
  Conceptualization of privacy: Privacy is implicitly defined as the extent to which people can control the release and dissemination of personal information.
  Role of control: Control is an antecedent to perceived invasion of privacy.
  Meaning of control: Control is operationalized as the technical ability to control when one’s video image is displayed.
  Nature of control: Behavioral

As shown in Table 1, there has been movement toward the interpretation of control as the ability of consumers to voice or exit in order to influence changes in organizational privacy practices they find objectionable (Malhotra et al. 2004). For example, Stewart and Segars (2002, p. 40) noted that “while consumers’ privacy concerns may relate to very specific information practices such as collection, secondary use, access, and errors, the supra concern that accounts for the interdependencies among these factors may be the degree of control over their personal information that is retained by the consumer.” Such a focus on actual control in the IS literature, however, excludes those aspects of psychological control that may not directly involve behavioral attempts to effect a change. Hoadley et al. (2010) echoed the importance of psychological control by noting that privacy is not simply related to the factual state of information disclosure, access, and use (i.e., the zeros and ones of data privacy). In their analysis of the Facebook News Feed privacy outcry, Hoadley et al. (2010) pointed out that “no privacy (from a zeros-and-ones perspective) was compromised due to the introduction of the feed features” (p. 57), because “Facebook’s old and new interfaces are isomorphic” (p. 55) in terms of actual controllability over who had access to what information. Yet the News Feed features “induce lower levels of perceived control over personal information due to easier information access, which in turn leads to a subjectively higher perception of privacy intrusion” (Hoadley et al. 2010, p. 57).

The above discussion indicates that the perceived loss of control over personal information may be a function of not only objective reality but also “the individual’s subjective beliefs, vicarious observations, and biases” (Hoadley et al. 2010, p. 57). “Veridicality is not necessary or sufficient to bring about the perception of control, although the perception of control, however illusory, may have a profound effect on the individual” (Wallston 2001, p. 49, as cited in Hoadley et al. 2010). The case of the Facebook News Feed privacy outcry has demonstrated how an “illusory” loss of control, prompted by the introduction of the News Feed features, triggered users’ privacy concerns. To provide a richer conceptual description of control, this study explores and empirically demonstrates the contribution of psychological control theories to the understanding of control in the privacy context.

In the psychology literature, control is commonly treated as a perceptual construct because perceived control affects human behavior much more than actual control (Skinner 1996). Perceived control has been generally defined as an individual’s beliefs about the presence of factors that may facilitate or impede performance of the behavior (Ajzen 2001). Being cognitive in nature, perceived control is subjective and need not necessarily involve attempts to effect a behavioral change (Langer 1975). That is to say, perceived control may be based on the individual’s evaluation of the objective reality (i.e., the resources and opportunities facilitating personal control), or it might be based on the individual’s attempts to “give up control” to someone who is more able than oneself to produce the desired outcome (Miller 1980). In this research, we define perceived control over personal information as an individual’s belief about the presence of factors that may increase or decrease the amount of control over the release and dissemination of personal information.

2.4. Privacy Assurance Approaches: A Control Agency Perspective

In the privacy literature, control has been understood as a form of information ownership (Westin 1967), conceptualized as the individual choice to opt in to or opt out of an electronic information exchange environment completely or selectively (Caudill and Murphy 2000), or operationalized as the technical ability to control the information display (Zweig and Webster 2002). This body of literature’s focus on individual control, however, makes it too narrow a conception because it excludes those aspects of control that are beyond individual choice. The follow-up question is whether individuals are able to exercise significant information control in all circumstances, given discrepancies in awareness and power in the process of data gathering and transfer (Schwartz 1999). The implication is that privacy assurance “is not just a matter for the exercise of individual actions but also an important aspect of institutional structure” (Xu et al. 2011, p. 799). As Solove (2002) noted, “[P]rivacy…is not simply a matter of individual prerogative; it is also an issue of what society deems appropriate to protect” (p. 1111). To provide a richer conceptual description of privacy assurance, this research demonstrates the contribution of control agency theory to the understanding of privacy assurance approaches. In particular, control agency theory allows us not only to examine the effects of personal control, in which the self acts as the control agent to protect privacy, but also to include proxy control, in which powerful others (such as the government and industry regulators) act as the control agents to protect privacy.

Two paths to enhancing control perceptions can be identified from the control agency perspective, which differentiates control conceptually among its various components. First, perceived control can be amplified by having direct personal control, where the control agent is the self (Bandura 2001, Skinner 1996). Individuals who value personal agency would prefer exercising direct personal control because they “would especially feel themselves more self-efficacious when their agency is made explicit” (Yamaguchi 2001, p. 226). Personal agency suggests that individuals are motivated to act upon opportunities that allow them to be the sole initiator of their behavior (Bandura 2001). Second, perceived control can be increased by having proxy control, where the control agent is powerful others (Bandura 2001, Yamaguchi 2001). With proxy control, individuals attempt to align themselves such that they are able to gain control through powerful others. In proxy agency, “people try by one means or another to get those who have access to resources or expertise or who wield influence and power to act at their behest to secure the outcomes they desire” (Bandura 2001, p. 13).

The privacy literature describes three major approaches to help protect information privacy: individual self-protection, industry self-regulation, and government legislation (Culnan and Bies 2003, Son and Kim 2008, Tang et al. 2008, Xu et al. 2010). These approaches fall into two generic categories based on the type of control agency they provide: the personal control-enhancing mechanism and the proxy control-enhancing mechanism. The former (via individual self-protection) comprises tools and approaches that enable individuals to directly control the flow of their personal information to others. As is evident with individual self-protection, the agent of control is the self, and the effect of this agency arises from the opportunity for direct personal control. The latter (via industry self-regulation or government legislation) refers to institution-based approaches in which powerful forces (i.e., a government legislator or industry self-regulator) act as the control agents for consumers to exercise proxy control over their personal information.

In general, the public has been skeptical about the efficacy of individual self-protection and industry self-regulation for protecting information privacy (Culnan 2000, Tang et al. 2008, Turner and Dasgupta 2003). As a result, privacy advocates continue to demand stronger government regulation to restrain abuse and mishandling of personal information by companies (Culnan 2000, Swire 1997). Prior research on privacy assurance approaches has focused on how these approaches individually influence privacy beliefs and privacy-related constructs such as privacy concerns, trust, or perceived risks (Hui et al. 2007, Son and Kim 2008, Tang et al. 2008). The direct effects

of these privacy assurance approaches on privacy-related beliefs are intuitively appealing and have been indirectly supported in prior research. Our study is particularly interested in the interaction effects, i.e., how interactions of these privacy assurance approaches may collectively influence privacy concerns through their effect on perceived control. Because these mechanisms do not appear in isolation in practice, their individual effects are less likely to be ecologically informative than are their interaction effects.

2.5. Privacy Concerns Pertaining to Location-Based Services

In contrast to most privacy research, which was conducted in the conventional Web context (e.g., Bansal et al. 2010, Dinev and Hart 2006, Son and Kim 2008), we develop and empirically test a research model in an understudied context: location-based services (LBS). LBS have a number of characteristics that make them particularly suitable for the current research. First, positioning systems are likely to endure as an important technology because of the significant investments made in their development and associated telecommunication infrastructure (ABI 2011, FTC 2009). Reputable firms like Google, Yahoo, and Facebook as well as startups like Foursquare, Gowalla, Loopt, and many others are entering the LBS market. Consequently, LBS that utilize spatial location information to provide value-added services to users will become a prevalent phenomenon globally (ABI 2011). Second, compared with the Internet environment, the ubiquitous computing environment offers individuals higher potential for control over communication and exchange of personal information (Junglas and Watson 2006). Therefore, understanding the link between perceived control and privacy concerns has become much more important.

Finally, in a context characterized by ubiquity and uniqueness, where consumers engage in more activities that collect, analyze, and visualize personal information, privacy concerns have become particularly salient (ABI 2011). As highlighted in the FTC's recent town hall meeting on the mobile marketplace, the use of LBS often discloses the geographical location of a user in real time, rendering the potential for privacy intrusions very significant (FTC 2009). These concerns pertain to mobile devices' automatic generation and transmission of consumers' location information; the confidentiality of accumulated personal data (e.g., location, identity, and behavior data); and the potential risk that individuals would experience with a breach of confidentiality (FTC 2009). To the degree that privacy concerns represent a major inhibiting factor in the adoption of LBS (ABI 2011), it is important to understand how such concerns can be addressed.


Figure 1 Research Model

[Figure 1 depicts the research model: individual self-protection (ISP), representing the personal control agency, and industry self-regulation (REG) and government legislation (LEG), representing the proxy control agency, influence perceived control over personal information (CTL), which in turn has a hypothesized effect (H1) on context-specific concerns for information privacy (CFIP). H2 and H3 denote the hypothesized interactions between the personal and proxy control agencies. Control variables: age, gender, education, desire for information control, trust propensity, and privacy experience. Paths not specifically hypothesized are included for statistical testing.]

3. Research Model and Hypothesis Development

Figure 1 presents the research model. In sum, the theoretical foundation for this study is centered on control agency theory (Bandura 2001, Yamaguchi 2001), which suggests that mechanisms triggering control perceptions over personal information can be instrumental in alleviating privacy concerns. By explicitly associating different mechanisms with different forms of control agency, the interaction effects of these mechanisms can be predicted. In the following sections, we present arguments for why each mechanism is expected to enhance perceived control. Then the research hypotheses are constructed around interactions between the personal and proxy control-enhancing mechanisms.

3.1. Effects of Perceived Control on Context-Specific Privacy Concerns

Although control has received attention as the common core of definitions of privacy (e.g., Awad and Krishnan 2006, Son and Kim 2008), researchers in law and social science have noted that it is important to treat these two concepts as separate and supporting concepts (Margulis 2003a, b; Solove 2002). Waldo et al. (2007) argued that "control over information cannot be the exclusive defining characteristic of privacy" (p. 61) and that privacy is more than control. Similarly, DeCew (1997) pointed out that "we often lose control over information in ways that do not involve an invasion of our privacy" (p. 53). Consistent with the emphasis in the law and social science literature on the relationship between control and privacy, we argue in this research that control is a key variable in determining the privacy state but that privacy is not control

per se. This distinction enables us to avoid conflating the concept of privacy with the concept of control, and it is used to justify the effects of privacy assurance approaches through different control agencies.

Prior privacy research suggests that consumers tend to have lower levels of privacy concerns when they believe that they have control over the disclosure and subsequent use of their personal information in a specific situation (Culnan and Armstrong 1999, Culnan and Bies 2003, Phelps et al. 2000). In other words, perceived control helps reduce people's context-specific privacy concerns if they perceive current and future risks as low (Milne and Culnan 2004). Empirical evidence has revealed that consumers' control perceptions over collection and dissemination of personal information are negatively related to privacy concerns regarding particular websites (Xu et al. 2008). These considerations suggest that in a specific context, individuals tend to have lower privacy concerns if they get a sense of greater control over their personal information that is collected and used by a specific company. Following this perspective, we propose that perceived control over personal information is negatively related to context-specific privacy concerns.

Hypothesis 1 (H1). Perceived control over personal information has a negative effect on the context-specific concerns for information privacy.

3.2. Effects of Privacy Assurance Approaches on Perceived Control

3.2.1. Personal Control Through Individual Self-Protection. Individuals strive for primary control over their environment when they exercise


personal control through individual self-protection actions (Weisz et al. 1984). Such a mechanism empowers individuals with direct control over how their personal information may be gathered by service providers. The privacy literature (Culnan and Bies 2003, Milne and Culnan 2004, Son and Kim 2008) describes two major types of individual self-protection: nontechnological and technological approaches. An array of nontechnological self-protection approaches has been discussed, including reading privacy policies, refusal to reveal personal information, misrepresentation of personal information, removal from mailing lists, negative word-of-mouth, complaining directly to the online companies, and complaining indirectly to third-party organizations (see Son and Kim 2008 for a review). In this research, we focus on examining technological self-protection approaches because these technological privacy assurances have been understudied in the IS field.

Technological self-protection approaches comprise privacy-enhancing technologies (PETs) that allow individuals to protect their information privacy by directly controlling the flow of their personal information to others (Burkert 1997). PETs are quite numerous, including anonymous Web surfing and communication tools, cookie management tools, and the Platform for Privacy Preferences (P3P) and its user agents (Cranor 2002). In the context of online social networks, to assuage user perceptions of privacy invasions, a number of social networking sites have been rolling out privacy-enhancing features that provide users with the capabilities to limit information disclosure and access (Hoadley et al. 2010). These privacy-enhancing features can give users means to control the disclosure, access, and use of their personal information and thus may increase the level of perceived control. In the specific context of LBS, users are given PETs to turn off location tracking on their mobile devices (Barkhuus et al. 2008, Tsai et al. 2010). Some devices also allow users to specify the accuracy of location information to be released to LBS providers (Barkhuus et al. 2008, Tsai et al. 2010). Hence, having mobile devices with PETs should facilitate individuals' beliefs that they can execute direct control over their personal information in the context of LBS.

3.2.2. Proxy Control Through Industry Self-Regulation. Industry self-regulation is a commonly used approach that mainly consists of industry codes of conduct and self-policing trade groups and associations as means of regulating privacy practices. In practice, third-party intervention has been employed to provide trustworthiness to companies through membership in self-policing associations (e.g., the Direct Marketing Association) or privacy

seals (e.g., TRUSTe)1 that are designed to confirm sufficient privacy assurance (Culnan and Bies 2003). An example of an industry self-regulator is the Cellular Telecommunications and Internet Association (CTIA), which has established guidelines for LBS providers on handling personal information linked to location (CTIA 2008). Other groups, such as TRUSTe, have prescribed responsible privacy practices and implementation guidelines for LBS providers to safeguard private information (TRUSTe 2004).

Prior research has shown that companies that announce membership in self-policing associations or seals of approval foster consumers' perceptions of control over their personal information (Xu et al. 2008). Further, failure to abide by the terms of self-policing associations or seals of approval can mean termination as a licensee of the program or "referral to the appropriate law authority, which may include the appropriate attorney general's office, the FTC, or the Individual Protection Agency" (Xu et al. 2011, p. 806). Thus these self-regulatory structures create incentives for companies to refrain from behaving opportunistically (Tang et al. 2008). Hence, in the context of LBS, we argue that having seals like TRUSTe certifying a firm's privacy practices may facilitate individuals' beliefs that they are able to execute proxy control over their personal information.

3.2.3. Proxy Control Through Government Regulation. According to Xu et al. (2010), "government regulation is another commonly used approach that relies on the judicial and legislative branches of a government for protecting personal information" (p. 143). Legal action was taken by the European Commission in the Directive on privacy and electronic communications (2002/58/EC),2 which explicitly requires location information to be used only with the consent of individuals and only for the duration necessary to provide the specific services. This directive further requires that individuals be provided with simple means to temporarily deny the collection and use of their location information. In the United States, the Location Privacy Protection Act of 20113 was drafted to safeguard user location information by requiring nongovernmental entities to obtain user consent before collecting or distributing location data. The main focus of this bill is to obtain consent from users

1 See the Direct Marketing Association at http://www.the-dma.org and TRUSTe at http://www.truste.org/ for examples.
2 See Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002: http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2002:201:0037:0037:EN:PDF.
3 Location Privacy Protection Act of 2011, S. 1223, 112th Cong. §3 (2011): http://franken.senate.gov/files/docs/Location_Privacy_Protection_Act_of_2011_Bill_Text.pdf.


of mobile devices before their locations are collected and shared with commercial entities or third parties such as advertisers.4

With government regulation, individuals can relinquish personal control and instead allow the legislation to exercise proxy control on their behalf to protect their personal information. Spiro and Houghteling (1981) indicated that the legal system is the most powerful approach for the execution of proxy control because it requires that offenders be punished in order to maintain its deterrent effectiveness. Legislation can decree the types of personal information companies and third parties are allowed to collect from consumers as well as the ways that collected information should be protected against misuse and breach (Swire 1997). Through enforcement agencies, regulations specify rules and provide redress to individuals who are harmed when the law is violated (Culnan and Bies 2003). Thus we argue that through government legislation, individuals can exercise proxy control over the collection and use of their personal information by service providers in the context of LBS.

3.2.4. Interaction Effects Between Personal and Proxy Control. Thus far, we have argued that each of the three privacy assurance approaches can potentially alleviate privacy concerns via its effect on the level of perceived control. These direct effects are intuitively appealing and have been indirectly supported in prior research. For example, regarding the direct effect of individual self-protection through PETs, prior research has indicated that having mobile devices with PETs to specify users' privacy preferences makes users feel more in control of their personal information and thus alleviates their privacy concerns pertaining to LBS (e.g., Barkhuus and Dey 2003). For the industry self-regulation approach, empirical evidence has revealed that having third-party privacy seals certifying a firm's privacy practices can increase consumers' control perceptions (e.g., Xu et al. 2008). For the government regulation approach, research has suggested that the privacy protection standards set by the government allow consumers to believe that companies will protect their disclosed information post-contractually, thereby increasing their perceived control over their personal information (e.g., Tang et al. 2008).

The focus of this research, however, is not on the direct effects of the three privacy assurance approaches. Rather, we are interested in the extent to which the personal control-enhancing mechanism interacts with the proxy control-enhancing mechanisms in

4 See also the Location Privacy Protection Act of 2011 (S. 1223) Bill Summary: http://franken.senate.gov/files/docs/110614_The_Location_Privacy_Protection_Act_of_2011_One_pager.pdf.

affecting control perceptions. In addition, because these mechanisms do not appear in isolation in practice, their individual effects are less likely to be ecologically informative than are their interaction effects. Accordingly, we propose a theoretical possibility based on the notion that individuals tend to minimize the amount of effort spent on information processing and will not process more information than necessary (Berghel 1997, Eppler and Mengis 2004). Therefore, if one of the two control agencies is sufficient to make a relatively risk-free judgment, the existence of other mechanisms may not matter much. To investigate this proposition, we look into the interaction effects of the two control-enhancing mechanisms by comparing the sources of control agency for each mechanism.

As discussed above, the differences among the three privacy assurance approaches can be attributed to differences in control agency (either self agency triggering personal control or powerful-others agency triggering proxy control). When individuals are placed in control of situations that affect them (i.e., the control agent is the self), they often feel greater autonomy (Yamaguchi 2001). For example, in a study of location-based applications, Barkhuus and Dey (2003) found that when users are provided options to indicate their own settings for how an application should collect their location information, they feel more in control of their interaction with the technology. Cognitively, self agency motivates greater user engagement and involvement, which is likely to result in positive attitudes given its guaranteed consonance with individual interests (Yamaguchi 2001). Empirical evidence has shown that imbuing the user with a sense of personal agency can have a powerful effect on attitudes because of its inherent egocentrism (Sundar and Marathe 2010). Individual self-protection, which activates self agency by its very nature, is expected to keep control of the flow of personal information in users' own hands. Thus, compared with proxy agency, where control is placed in the hands of other parties, self agency via individual self-protection serves to inculcate a greater sense of agency in users themselves, which could have a stronger impact on their perceived control over personal information. In a particular context of information disclosure, once people disclose their personal information, they have no ability to know what is actually being done with it and no ability to change practices they object to. Therefore, when self agency is available for individuals to control their personal information, they are likely to have less of a need for proxy control via industry self-regulation or government legislation.

Hypothesis 2 (H2). The availability of personal control diminishes the effects of proxy control via industry self-regulation.


Hypothesis 3 (H3). The availability of personal control diminishes the effects of proxy control via government legislation.
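H2 and H3 are interaction hypotheses. The paper itself estimates its model with PLS on experimental data; purely as an illustrative sketch (the sample size, effect sizes, and noise level below are invented), a moderated regression on simulated data shows the form such interaction tests take, where negative ISP × REG and ISP × LEG coefficients would indicate that proxy control adds less when personal control is available:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 800

# Hypothetical 2 x 2 x 2 design: presence (1) or absence (0) of each assurance.
isp = rng.integers(0, 2, n)   # individual self-protection
reg = rng.integers(0, 2, n)   # industry self-regulation
leg = rng.integers(0, 2, n)   # government legislation

# Simulated perceived control (CTL): positive main effects plus negative
# ISP x REG and ISP x LEG interactions, as H2/H3 would predict.
ctl = (0.8 * isp + 0.6 * reg + 0.6 * leg
       - 0.4 * isp * reg - 0.4 * isp * leg
       + rng.normal(0, 0.5, n))

# Design matrix: intercept, main effects, and the two hypothesized interactions.
X = np.column_stack([np.ones(n), isp, reg, leg, isp * reg, isp * leg])
beta, *_ = np.linalg.lstsq(X, ctl, rcond=None)

for name, b in zip(["const", "ISP", "REG", "LEG", "ISP x REG", "ISP x LEG"], beta):
    print(f"{name:>9}: {b:+.2f}")
```

In this sketch, support for H2 and H3 would appear as estimated interaction coefficients that are reliably negative while the main effects remain positive.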

3.3. Control Variables

Scholars have identified two categories of factors that have a direct bearing on consumers' privacy concerns in a specific context: (i) an individual's personal characteristics and (ii) situational cues that enable a person to assess the consequences of information disclosure (Xu et al. 2008). In this research, we have examined the availability of privacy assurance approaches (individual self-protection, industry self-regulation, and government legislation) as salient situational cues that influence individuals' privacy concerns in the specific context of LBS. Factors related to individual characteristics other than those situational cues may also influence individuals' privacy concerns. To control for those effects, we have included interpersonal differences as covariates in the model.

First, we included three demographic characteristics (Culnan 1995, Malhotra et al. 2004): age, gender, and education. Demographic differences have been found to influence the degree of general privacy concerns. For example, consumers who were less likely to be concerned about privacy were more likely to be young, male, and less educated (Culnan 1995, Sheehan 1999). Although prior research only demonstrated the influence of these demographic variables on general privacy concerns, we control for them because they could potentially affect the degree of privacy concerns in the specific context of LBS.

Second, personality traits have been found to affect general privacy concerns in prior research. The individual personality differences we examine are trust propensity and desire for information control. Trust propensity has been defined as "the extent to which a person displays a tendency to be willing to depend on others across a broad spectrum of situations and persons" (McKnight et al. 2002, p. 339). Trust propensity increases trust belief in a specific context (e.g., toward a service provider; Pavlou and Gefen 2004) and thus may decrease individuals' specific concerns for information privacy (Hui et al. 2007). Therefore, we treat this factor as a control variable in this research. Desire for information control reflects individuals' expected control over what organizations do with their information as well as the amount and types of information organizations will collect (Phelps et al. 2000). This variable has been conceptualized as an individual's general tendency to desire information control, which is possessed by all individuals to a greater or lesser degree (Phelps et al. 2000). Research suggests that consumers who desire

greater control over personal information were much more likely to have higher privacy concerns than were consumers who desired less control over personal information (Phelps et al. 2000). Although prior research only demonstrated the influence of desire for information control on general privacy concerns, we control for its influence because it could potentially influence the degree of privacy concerns in a specific context.

Third, we include interpersonal difference in terms of previous privacy experience as a covariate in the model. Individuals who have been exposed to or been the victim of personal information abuses are found to have stronger concerns for information privacy in general (Smith et al. 1996). Although prior research only demonstrated the influence of previous privacy experience on general privacy concerns, we control for its influence because it may potentially affect the degree of privacy concerns in a specific context.

4. Research Method

4.1. Experimental Design and Manipulation

An experimental study was conducted to test the research hypotheses. Three privacy assurance approaches were manipulated independently, and their effects on individuals' psychological responses were examined. A location tracking service (a location-aware mobile-coupon service) was utilized because LBS providers with this type of application tend to be more aggressive in promoting their services through active tracking of user location information, thereby resulting in greater privacy concerns among users (Barkhuus and Dey 2003).

The experiment had a 2 × 2 × 2 factorial design. Eight experimental scenarios were created by manipulating the presence and absence of three privacy assurance approaches (individual self-protection, industry self-regulation, and government legislation). Individual self-protection (ISP) was manipulated by providing technological self-protection features, i.e., privacy control functions in mobile devices (see Appendix C, which is available as part of the online version that can be found at http://dx.doi.org/10.1287/isre.1120.0416). Based on Barkhuus and Dey (2003), we manipulated individual self-protection by providing subjects with an interactive graphical interface of a mobile device that allows them to restrict the location information released to the LBS provider. Because TRUSTe is widely adopted and the seal is applicable to the wireless industry, industry self-regulation (REG) was manipulated by showing subjects a TRUSTe seal with a URL link to the privacy policy of the LBS provider (see Appendix D1). A brief introduction explaining the


mission of TRUSTe was also given to the subjects to make sure they understood the significance of the TRUSTe seal (see Appendix D2). We manipulated government legislation (LEG) by telling the subjects that the use of LBS was governed by a newly enacted location privacy law that covered the collection and use of their personal information. Subjects were also presented with a piece of news related to the recent enactment of the location privacy law (see Appendix E).
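The 2 × 2 × 2 crossing of the three manipulations, with equal-probability assignment of subjects to the eight resulting scenarios, can be sketched as follows. The factor labels and the assignment routine are illustrative assumptions, not the authors' actual Web-based experiment system:

```python
import itertools
import random

# The eight experimental scenarios arise from crossing the presence (1) or
# absence (0) of the three privacy assurance manipulations.
FACTORS = ("ISP", "REG", "LEG")
SCENARIOS = list(itertools.product([0, 1], repeat=3))  # 2 x 2 x 2 = 8 cells

def assign_scenario(rng: random.Random) -> dict:
    """Assign a subject to one cell with equal probability (a sketch of the
    random assignment described in the procedure, not the actual system)."""
    cell = rng.choice(SCENARIOS)
    return dict(zip(FACTORS, cell))

rng = random.Random(42)
subject = assign_scenario(rng)
print(len(SCENARIOS), subject)
```

Each subject's treatment is thus a triple indicating which of the three assurance cues (ISP, REG, LEG) are present in the scenario shown to that subject.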

4.2. Operationalization of Variables and Pilot Study

Context-Specific Concerns for Information Privacy (CFIP). We adapted the scale of CFIP developed by Smith et al. (1996) to measure privacy concerns pertaining to LBS. It has been demonstrated in the privacy literature (e.g., Angst and Agarwal 2009, Stewart and Segars 2002) that CFIP is better modeled as a reflective second-order factor.5 When adapting CFIP to the Internet context to develop the IUIPC scale, Malhotra et al. (2004) theorized and operationalized IUIPC as a reflective second-order factor. Given the strong theoretical and empirical evidence, we conceptualized CFIP as a reflective second-order factor comprising four first-order components: collection of personal information, unauthorized secondary use of personal information, errors in personal information, and improper access to personal information. Changes were made to the CFIP instrument to reflect privacy concerns specific to LBS by replacing the word companies with the company, referring to the specific LBS provider.
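As a rough illustration only: the item responses below are invented, and the paper estimates CFIP as a latent reflective second-order factor in PLS rather than by simple averaging, but a crude sketch of aggregating the four first-order dimensions into one overall score looks like this:

```python
import statistics

# Hypothetical 7-point item responses for one respondent, grouped by the four
# first-order CFIP dimensions named in the text.
responses = {
    "collection": [6, 5, 6],
    "secondary_use": [7, 6, 6],
    "errors": [4, 5, 4],
    "improper_access": [6, 6, 7],
}

# Crude proxy for the second-order construct: average the first-order
# dimension means (PLS would instead estimate loadings for each dimension).
dimension_scores = {d: statistics.mean(items) for d, items in responses.items()}
cfip_score = statistics.mean(dimension_scores.values())
print(round(cfip_score, 2))
```

A latent-variable estimate would weight dimensions by their loadings instead of equally, which is why the robustness checks in the footnotes compare reflective and formative specifications.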

Perceived Control over Personal Information (CTL) was measured using five reflective indicators.6 The prior privacy literature (Goodwin 1991, Hoffman et al. 1999, Phelps et al. 2000, Spiekermann and Cranor 2009) noted that control over personal information may be governed by two larger dimensions: (1) control over information release at the frontend, where personal data flow in and out of users, and (2) control over unwanted access and use of personal information at the backend, where personal data are processed, stored, and transferred across different data sharing networks

5 Slyke et al. (2006) was the exceptional case in which CFIP was modeled as a formative second-order factor. We conducted a robustness test by re-running the PLS analyses with CFIP as a formative factor. Some path coefficients in this new model were slightly different from our current model, in which CFIP was modeled as a reflective factor, but there was no significant change in the overall pattern of results.
6 Similarly to the case of CFIP, we conducted robustness tests by re-running the PLS analyses with CTL as a formative factor. Some path coefficients in this new model were slightly different from our current model, in which CTL was modeled as a reflective factor, but there was no significant change in the overall pattern of results.

and infrastructure. We adapted items measuring perceived control in the context of health psychology (Reed et al. 1993) to focus on perceived control over personal information in the context of privacy. We integrated elements related to the measures of perceived control over information release and over unwanted access and use.

Control Variables. To the extent possible, scales for the control variables were adapted from prior studies to fit the context of this research. Desire for information control was assessed with three indicators taken from Phelps et al. (2000). The measurement items focus on an individual's expected control over what companies do with their information as well as the amount and types of information companies will collect (Phelps et al. 2000). Privacy experience was measured with three questions adapted from Smith et al. (1996), and trust propensity was measured with three questions taken from McKnight et al. (2002). See Appendix A for the operationalization details.

The initial set of items was reviewed by six information systems faculty members and doctoral students and modestly modified as a result of the feedback. Next, we conducted a pilot study involving 86 graduate and undergraduate students to assess the strength of the manipulations, gauge the clarity of the questions, and verify the clarity and conciseness of the experimental procedure and instructions. The participants provided detailed feedback through interviews. Based on this feedback, we clarified and streamlined the experimental instructions, reorganized the instrument layout, and reworded some items.

4.3. Procedure and Task

At the start of each experimental session, the subjects were told that all instructions were available online and that they should read the instructions carefully and complete the experiment independently. After logging into the Web-based experimental system, the subjects answered a pre-session questionnaire that collected personal information for control checks. Next, we provided a cover story to all subjects (see Appendix B). They were told that Company A would soon be introducing an LBS application (a location-aware mobile-coupon service) in the market and that their feedback was being solicited to evaluate this service. General information about the service provider and the service was also provided to the subjects. The experimental system tracked the browsing history of the subjects to ensure that all the pertinent experimental materials were read.

Next, the Web-based experiment system randomly generated an experimental scenario such that each subject had an equal and independent chance of being assigned to any of the eight experimental

Downloaded from informs.org by [130.203.152.80] on 19 February 2014, at 06:34. For personal use only, all rights reserved.


Xu et al.: Privacy Assurances, Perceived Control, and Privacy Concerns. Information Systems Research 23(4), pp. 1342–1363, © 2012 INFORMS, p. 1353

treatments. Subjects assigned to treatments involving individual self-protection were provided with the interactive graphical interface of privacy-enhancing features (see Appendix C). Subjects assigned to treatments involving industry self-regulation were shown the TRUSTe seal and the URL link to the privacy policy (see Appendices D1 and D2). Finally, subjects assigned to treatments involving government legislation were given the privacy protection laws and a piece of enforcement news (see Appendix E). The order of presenting these treatments was randomized by our online experiment system. To conclude the experiment, the subjects answered a post-session questionnaire that contained the questions measuring perceived control and context-specific concerns for information privacy, along with other information for manipulation checks. Subject responses on the meaningfulness of the task (see Appendix A for the questions used for the task validity check) were significantly higher than the neutral value.

4.4. Participants
We recruited participants by posting announcements on major Web portals in Singapore. The posting provided some background about the researchers and the study without revealing the experimental treatments and invited forum participants to complete this study. Those who chose to participate could easily do so by clicking on the URL provided in the posting, which took them to the experimental system. This recruitment procedure yielded a total of 198 subjects. To motivate participants to complete the study, a lottery with four prizes7 (a mobile phone, an MP3 player, a Bluetooth headset, and a cash prize) was offered. The strategy of rewarding subjects through raffle prizes in exchange for divulging personal attitudes or behaviors is well established in survey methodology (e.g., Pavlou et al. 2007, Wang and Benbasat 2009). Among the respondents, 54% were male and 46% were female; 57% reported that they had no experience using LBS in the past six months, whereas 43% reported some usage experience. Specific demographic information is shown in Appendix F.

Unlike the traditional survey sampling approach, in which a response rate is calculated and the demographics of the sample are compared against those of the population, our sampling approach does not report the number of individuals who saw the request for participation. To address this sampling concern, we compared the demographic characteristics of our

7 We believe that the amount of the raffle prizes offered in this study is appropriate. The amount of our highest prize (around US$85) is lower than that (US$100) offered in Pavlou et al. (2007).

respondents with the general demographic characteristics of Singaporean mobile phone users reported by Singapore Statistics8 and the annual survey by the Infocomm Development Authority (iDA).9 The respondents did not differ from a nationally representative sample in terms of gender ratio, age, mobile phone usage, and Internet and online shopping experience. Nonresponse bias (Armstrong and Overton 1977) was also assessed by verifying (1) that the respondents' demographics are consistent with those of current mobile phone users in Singapore and (2) that early and late respondents were not significantly different. Early respondents were those who responded within the first week (48%). The early and late respondent groups were compared on their demographics (age, gender, education, income, and mobile application usage experience), perceived control, privacy concerns, desire for information control, trust propensity, and privacy experience. All t-test comparisons between the means of the two groups showed insignificant differences. Thus, we believe that the demographics of the participants who clicked on the link to our experiment are roughly consistent with those of general mobile phone users in Singapore.

5. Data Analyses and Results
5.1. Manipulation and Control Checks
Experimental manipulations were checked in three steps. First, JavaScript programming was used to log the time each subject spent on the treatment page in order to verify that subjects had browsed the materials pertaining to their respective treatments. Data from 12 subjects were dropped because these subjects spent fewer than 10 seconds reading the manipulation materials. Second, experimental manipulations were checked using yes/no questions on the existence or absence of the privacy assurance approaches in order to confirm subjects' awareness of the assurances. Data from eight subjects were dropped because they failed to correctly respond to the manipulation check questions. Third, we included perceptual questions to test the effectiveness of the manipulations. Paired-sample t-tests show that all treatments were manipulated effectively. Specifically, participants in the present-ISP treatment group believed that they could better control when the service provider can track and communicate with their mobile devices than participants in the absent-ISP treatment did (t = 9.92, p < 0.001). Participants in the present-REG treatment group believed that the service provider was less likely to violate their privacy and could protect their data better

8 http://www.singstat.gov.sg/keystats/annual/indicators.html.
9 http://www.ida.gov.sg/Publications/20061205092557.aspx.


than participants in the absent-REG treatment did (t = 12.65, p < 0.001). Participants in the present-LEG treatment group believed that relevant legislation could better govern the protection of their private information than participants in the absent-LEG treatment did (t = 16.62, p < 0.001). These three-step manipulation checks resulted in a data set of 178 usable and valid responses.
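The group comparisons above can be sketched with a simple two-sample t statistic. This is an illustrative sketch on made-up ratings, not the study's data, and it uses the Welch (unequal-variances) formula rather than necessarily matching the authors' exact test:

```python
from statistics import mean, variance

def welch_t(x, y):
    """Welch's two-sample t statistic: compares mean ratings between a
    'present' and an 'absent' treatment group (unequal variances allowed)."""
    vx, vy = variance(x), variance(y)
    return (mean(x) - mean(y)) / (vx / len(x) + vy / len(y)) ** 0.5

# Hypothetical 7-point perceived-control ratings for two treatment groups.
present = [6, 5, 7, 6, 5, 6]
absent = [3, 4, 2, 3, 4, 3]
t = welch_t(present, absent)  # large positive t => higher ratings when ISP is present
```

A statistically significant positive t here would mirror the pattern reported above: subjects shown the privacy assurance rate their perceived control higher than subjects who are not.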

In addition, Mann-Whitney tests showed that the gender ratio and education level of subjects did not differ significantly across the various treatments. ANOVA tests revealed that subjects assigned to the various treatments did not differ significantly in terms of their age, income, mobile device usage experience, desire for information control, trust propensity, and privacy experience. Hence, the random assignment of subjects to the various treatments appeared to be effective.

5.2. Measurement Validation
A second-generation causal modeling statistical technique, partial least squares (PLS), was used for data analysis. We evaluated the measurement model by examining the convergent validity and discriminant validity of the research instrument. Convergent validity is the degree to which different attempts to measure the same construct agree (Cook and Campbell 1979). In PLS, three tests are used to determine the convergent validity of reflective constructs measured in a single instrument: reliability of items, composite reliability of constructs, and average variance extracted by constructs. We assessed item reliability by examining the loading of each item on its construct and found that the loadings for all items exceeded 0.65 (see Appendix G), indicating adequate reliability (Falk and Miller 1992). Composite reliabilities of constructs with multiple indicators exceeded Nunnally's (1978) criterion of 0.7. The average variances extracted for the constructs were all above 0.5, and the Cronbach's alphas were all higher than 0.7. As can be seen from the confirmatory factor analysis (CFA) results in Appendix G10 and the reliability scores in Table 2, these results support the convergent validity of the measurement model.
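As a concrete illustration of the reliability criteria above, composite reliability and AVE can be computed directly from standardized indicator loadings; the loadings below are hypothetical, not taken from Appendix G:

```python
def composite_reliability(loadings):
    # CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)
    s = sum(loadings) ** 2
    e = sum(1 - l ** 2 for l in loadings)  # error variance of each indicator
    return s / (s + e)

def average_variance_extracted(loadings):
    # AVE = mean of squared standardized loadings
    return sum(l ** 2 for l in loadings) / len(loadings)

loadings = [0.85, 0.88, 0.82]  # hypothetical three-item reflective construct
cr = composite_reliability(loadings)
ave = average_variance_extracted(loadings)
assert cr > 0.7   # Nunnally's (1978) criterion used in the text
assert ave > 0.5  # the AVE threshold used in the text
```

With these loadings, CR is about 0.89 and AVE about 0.72, comfortably above the 0.7 and 0.5 thresholds the paper applies.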

The construct of CFIP is modeled as a second-order variable reflecting four first-order factors, namely collection, unauthorized access, error, and secondary use. According to Wetzels et al. (2009), there are two approaches to estimating models with second-order factors in PLS path modeling: the repeated use

10 To perform CFA in PLS, the following procedure was suggested by Chin (1998) and applied in Agarwal and Karahanna (2000): The loadings for the construct's own indicators were provided by PLS. To calculate cross-loadings, a factor score for each construct was calculated based on the weighted sum of the construct's indicators. These factor scores were then correlated with all other indicators to calculate cross-loadings of other indicators on the construct.

of manifest indicators approach (Wetzels et al. 2009) and the latent variable score approach (Chin 1998). In this research, we adopted the latter approach to measure the second-order construct of CFIP using summated scales, which were represented by factor scores derived from the confirmatory factor analysis (Agarwal and Karahanna 2000).

Discriminant validity is the degree to which measures of different constructs are distinct (Campbell and Fiske 1959). To test discriminant validity (Chin 1998), (1) indicators should load more strongly on their corresponding construct than on other constructs in the model, and (2) the square root of the variance shared between a construct and its measures should be greater than the correlations between the construct and any other construct in the model. As shown in Appendix G, all factor loadings are higher than cross-loadings. Furthermore, as shown by comparing the diagonal to the nondiagonal elements in Table 2, all constructs share more variance with their indicators than with other constructs. Therefore, all items fulfilled the requirement of discriminant validity.
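The second criterion is mechanical to check: the square root of each construct's AVE must exceed that construct's correlation with every other construct. A minimal sketch, using two values adapted from Table 2 (where the diagonal entries are already the square roots of the AVEs):

```python
from itertools import combinations
from math import sqrt

def fornell_larcker_ok(ave, corr):
    """ave: construct name -> AVE; corr: frozenset({a, b}) -> correlation.
    True when sqrt(AVE) exceeds every inter-construct correlation."""
    return all(
        sqrt(ave[a]) > abs(corr[frozenset({a, b})])
        and sqrt(ave[b]) > abs(corr[frozenset({a, b})])
        for a, b in combinations(ave, 2)
    )

# From Table 2: diagonal 0.89 for CTL and 0.88 for COL imply these AVEs,
# and the CTL-COL correlation is -0.41.
ave = {"CTL": 0.89 ** 2, "COL": 0.88 ** 2}
corr = {frozenset({"CTL", "COL"}): -0.41}
assert fornell_larcker_ok(ave, corr)  # 0.89 and 0.88 both exceed |−0.41|
```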

Finally, although common method bias might not be a major concern when measures of independent and dependent variables are obtained from different sources (Podsakoff et al. 2003), as in an experimental study, we incorporated considerations for addressing this concern into our research. First, in the research and instrument design, to control for acquiescence bias, we used bipolar scales and provided verbal labels for the midpoints of scales. Based on feedback from the pilot tests, we provided examples for terms or concepts with which subjects might be unfamiliar. Second, following Podsakoff et al. (2003), we ran Harman's single-factor test for common method variance. If common method variance were a serious problem in the study, we would expect a single factor to emerge from a factor analysis or one general factor to account for most of the covariance among measures (Podsakoff et al. 2003). We performed an exploratory factor analysis by loading all items on a single factor. No general factor was apparent in the unrotated factor structure, with Factor 1 accounting for 22% of the variance, indicating that common method variance is unlikely to be a serious problem in the data. Third, we ran Lindell and Whitney's (2001) test, which uses a theoretically unrelated construct (termed a marker variable) to adjust the correlations among the principal constructs (Malhotra et al. 2006, Son and Kim 2008). We assessed the correlation between the marker variable and our research constructs because they were assumed to have no relationship. The results indicated that the average correlation coefficient was close


Table 2 Construct Correlation Matrix

          CTL    COL    AC     ER     USE    DIC    EXP    TRU    AGE    SEX    ED     ISP    REG    LEG   ISP×LEG ISP×REG
CTL       0.89
COL      −0.41   0.88
AC       −0.56   0.76   0.93
ER       −0.38   0.57   0.70   0.94
USE      −0.59   0.69   0.88   0.63   0.96
DIC       0.05   0.13   0.10   0.08   0.08   0.94
EXP      −0.04   0.24   0.19   0.17   0.21   0.19   0.84
TRU       0.12  −0.09  −0.14  −0.12  −0.15   0.14   0.03   0.79
AGE      −0.16   0.10   0.08   0.10   0.14   0.03   0.14   0.01   1.00
SEX       0.17   0.03   0.07   0.09   0.11   0.04  −0.02   0.12  −0.14   1.00
ED       −0.05   0.07   0.03   0.08   0.01   0.02   0.05   0.05   0.25  −0.20   1.00
ISP       0.20  −0.13  −0.08  −0.02  −0.08  −0.01  −0.12   0.07  −0.12   0.02   0.07   1.00
REG       0.39  −0.15  −0.17  −0.15  −0.19   0.11  −0.03   0.05  −0.09   0.07  −0.08  −0.01   1.00
LEG       0.32  −0.01  −0.13  −0.01  −0.20  −0.10   0.11   0.05  −0.05   0.07  −0.03  −0.01  −0.01   1.00
ISP×LEG  −0.20   0.08   0.09   0.08   0.15  −0.03  −0.11   0.04   0.04  −0.08   0.15   0.01  −0.02  −0.01   1.00
ISP×REG  −0.04   0.05   0.04   0.15   0.10  −0.13  −0.15   0.02   0.06  −0.12  −0.03   0.01  −0.01  −0.02  −0.01   1.00

Notes. CTL = perceived control; COL = collection; AC = unauthorized access; ER = error; USE = secondary use; DIC = desire for information control; EXP = privacy experience; TRU = trust propensity; AGE = age; SEX = gender; ED = education level; ISP = individual self-protection; REG = industry self-regulation; LEG = government legislation. Values on the diagonal are the square root of the average variance extracted (AVE).

to zero (r = 0.02, n.s.).11 Thus, it seems reasonable to argue that the present study is relatively robust against common method biases.
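The Lindell and Whitney (2001) adjustment mentioned above partials the marker correlation out of each substantive correlation; a minimal sketch (the substantive correlation used as input is hypothetical, while r_M = 0.02 is the average marker correlation reported above):

```python
def marker_adjusted_r(r, r_marker):
    # Lindell & Whitney (2001): r_adj = (r - r_M) / (1 - r_M),
    # where r_M is the marker variable's correlation used as the CMV estimate
    return (r - r_marker) / (1 - r_marker)

# A hypothetical substantive correlation of 0.45 is barely attenuated
# when the marker correlation is near zero.
r_adj = marker_adjusted_r(0.45, 0.02)  # about 0.439
```

When the marker correlation is essentially zero, as reported here, the adjusted and unadjusted correlations coincide, which is the basis for the robustness claim above.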

5.3. Testing the Structural Model
After establishing the validity of the measures, we conducted the PLS analysis to test the hypotheses. Because PLS does not generate any overall goodness-of-fit indices, predictive validity is assessed primarily through an examination of the explanatory power and significance of the hypothesized paths. The explanatory power of the structural model is assessed based on the amount of variance explained in the endogenous construct (i.e., context-specific concerns for information privacy).

Table 3 depicts the structural models. We estimated four models. Model 1 is the full model with interaction effects, whereas Model 2 is a model with only main effects. Model 3 includes only the theoretical variables (excluding control variables) as predictors. Model 4 is a controls-only model that provides a benchmark for assessing the additional impact of the theoretical variables. An examination of the results for the full model (Model 1) reveals that the path coefficient expressing the effect of perceived control on CFIP is −0.60 and highly significant (t = 9.92). Hypothesis H1 is thus supported: when perceived control over personal information increases, CFIP decreases. A comparison of Models 1 and 4 shows that the full model explains a substantive incremental variance of 30.3% (40.7% − 10.4%). In contrast,

11 We measured the marker variable of fashion leadership based on items (see Appendix A) from Goldsmith et al. (1993).

including the control variables on top of the independent variables explains only an additional 5.3% (40.7% − 35.4%) of the variance, as shown by a comparison of Models 1 and 3. These results suggest that our theoretical model is substantive enough to explain a large proportion of the variance in context-specific concerns for information privacy.
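The incremental-variance arithmetic above follows directly from the R² values reported for the CFIP equation in Table 3:

```python
# R-squared values (as proportions) from Table 3 for the CFIP equation.
r2_full = 0.407         # Model 1: theoretical + control variables
r2_theoretical = 0.354  # Model 3: theoretical variables only
r2_controls = 0.104     # Model 4: control variables only

# Theoretical variables add 30.3 points over controls alone (Models 1 vs. 4);
# controls add only 5.3 points over the theoretical model (Models 1 vs. 3).
gain_over_controls = r2_full - r2_controls   # 0.303
gain_over_theory = r2_full - r2_theoretical  # 0.053
```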

Table 3 Path Coefficients for Structural Model

                                        Model 1        Model 2        Model 3        Model 4
                                        (interaction   (main effects  (theoretical   (control
Effect                                   model)         model)         constructs)    constructs)

Perceived Control (CTL)
  Individual self-protection (ISP)       0.21**         0.22**         0.21**
  Industry self-regulation (REG)         0.39**         0.37**         0.39**
  Government legislation (LEG)           0.32**         0.34**         0.32**
  ISP × LEG                             −0.19*                        −0.19*
  ISP × REG                             −0.04                         −0.04
  R² (%)                                33.5           29.5           33.5

Context-Specific Concerns for Information Privacy (CFIP)
  Perceived control                     −0.60**        −0.60**        −0.60**
  Age                                    0.01           0.01                          0.09
  Gender                                 0.02           0.01                          0.04
  Education level                        0.01           0.01                          0.04
  Desire for information control         0.08           0.08                          0.10
  Trust propensity                      −0.04          −0.04                         −0.13
  Privacy experience                     0.16*          0.15*                         0.21*
  R² (%)                                40.7           40.0           35.4           10.4

*p < 0.05, **p < 0.01.


5.3.1. Interaction Test. We hypothesized interaction effects between personal control agency and proxy control agency (H2 and H3), which were tested using PLS (see Model 1 of Table 3). As H3 predicted, there was a significant interaction between individual self-protection and government legislation (b = −0.19, p < 0.05, ΔR² = 4%). But the interaction between individual self-protection and industry self-regulation (H2) was not significant (see Model 1 of Table 3). The test for the significance of the interaction effect followed Carte and Russell (2003), examining whether the variance explained by the interaction effect is significant beyond the main effects, using the F-statistic F = [ΔR²/(df_interaction − df_main)]/[(1 − R²_interaction)/(N − df_interaction − 1)]. The F-statistic was F(2,172) = 5.17 (p < 0.01), thereby supporting the significant interaction effect between individual self-protection and government legislation.
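Plugging the reported values into the Carte and Russell (2003) statistic reproduces the F(2,172) = 5.17 reported above; the degrees of freedom (five predictors of perceived control in the interaction model, three in the main-effects model, N = 178 after the manipulation checks) are our reading of the models in Table 3:

```python
def interaction_f(r2_int, r2_main, df_int, df_main, n):
    # Carte & Russell (2003):
    # F = [dR^2 / (df_int - df_main)] / [(1 - R^2_int) / (N - df_int - 1)]
    numerator = (r2_int - r2_main) / (df_int - df_main)
    denominator = (1 - r2_int) / (n - df_int - 1)
    return numerator / denominator

f = interaction_f(r2_int=0.335, r2_main=0.295, df_int=5, df_main=3, n=178)
# f is about 5.17 on (2, 172) degrees of freedom, matching the text
```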

To further validate the interaction effect between individual self-protection and government legislation, we followed the Chin et al. (2003) approach adopted in Im and Rai (2008) to compare the R² values between the main and interaction effect models using Cohen's f² [= (R²_interaction − R²_main)/(1 − R²_main)]. Controlling for the study's control variables, the variance explained in perceived control was 33.5% when accounting for the interaction effect, whereas only 29.5% was explained with the main effects alone. The interaction constructs have an effect size f² of 0.06, which is between a small and medium effect (Chin et al. 2003), thus confirming H3.
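The effect-size calculation is a one-liner from the two R² values just given:

```python
def cohens_f2(r2_interaction, r2_main):
    # Chin et al. (2003): f^2 = (R^2_interaction - R^2_main) / (1 - R^2_main)
    return (r2_interaction - r2_main) / (1 - r2_main)

f2 = cohens_f2(0.335, 0.295)  # about 0.057, i.e., the 0.06 reported above
```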

5.3.2. Mediation Test. In our theoretical model, we posited that perceived control would mediate the relationship between the privacy assurance approaches and CFIP. To test for this mediation, we followed Nicolaou and McKnight's (2006) approach of conducting two related techniques. First, power analysis may provide information about the significance of omitted paths in a reduced model (Cohen 1988). We restricted the research model to include only the paths from the privacy assurance approaches (ISP, REG, and LEG) to context-specific privacy concerns. We found that all three privacy assurance approaches have significant direct effects on CFIP (see Figure 2(a)). However, the percentage of variance explained in context-specific privacy concerns decreased from 35.4% (see Model 3 of Table 3) to 18.6% (see Figure 2(a)). Chin (1998) recommends calculating an effect size for the omission of paths from the model: the effect size when the perceived control path is omitted equals 0.26, which is between a medium and large effect. This shows that perceived control has an important effect on CFIP and that researchers should not exclude it from the model.
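Chin's (1998) effect size for an omitted path can be reproduced from the two R² values above:

```python
def omitted_path_f2(r2_included, r2_excluded):
    # f^2 = (R^2_included - R^2_excluded) / (1 - R^2_included)
    return (r2_included - r2_excluded) / (1 - r2_included)

f2 = omitted_path_f2(0.354, 0.186)  # about 0.26, a medium-to-large effect
```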

Second, Baron and Kenny (1986) suggested that mediation is demonstrated if three conditions are fulfilled. The first condition stipulates that the independent variable must significantly affect the proposed

Figure 2(a) PLS Model Without Perceived Control. Paths to context-specific privacy concerns: individual self-protection −0.15*, industry self-regulation −0.32**, government legislation −0.18**; R² = 18.6%.

Figure 2(b) PLS Model with Perceived Control. Paths to context-specific privacy concerns: perceived control −0.55**, individual self-protection −0.07, industry self-regulation −0.19**, government legislation −0.09; R² = 37.9%.

Note. *p < 0.05, **p < 0.01.

mediator. As Table 3 shows, the relationships between the proposed mediator (perceived control) and the three independent variables (ISP, REG, and LEG) were significant. The second condition requires that the independent variable significantly affect the dependent variable. The PLS results (see Figure 2(a)) demonstrated that the three independent variables (ISP, REG, and LEG) were significantly related to the dependent variable (CFIP). The last condition stipulates that the relationship between the independent variable and the dependent variable should be weaker or insignificant when the proposed mediator is included than when it is not. The results (see Figure 2(b)) indicated that the path coefficient for individual self-protection (b = −0.07) and the path coefficient for government legislation (b = −0.09) were not significant when perceived control was included in the model; the path coefficient for industry self-regulation was lower when perceived control was included in the model (b = −0.19, compared with b = −0.32). Figures 2(a) and 2(b) summarize the results for testing the mediating effect of perceived control, which satisfied the conditions needed to establish mediation per Baron and Kenny (1986).

Sobel (1982) tests were also conducted as a means of further examining evidence for mediation above and beyond the procedures recommended by Baron and Kenny (1986). This test is designed to assess whether a mediating variable significantly carries the influence of an independent variable to a dependent variable, i.e., whether the indirect effect of the independent variable on the dependent variable through the mediator is significant. Results of the Sobel tests supported the mediating effects of perceived control for (i) the relationship between ISP and CFIP


(z = 2.96, p < 0.01), (ii) the relationship between REG and CFIP (z = 4.34, p < 0.01), and (iii) the relationship between LEG and CFIP (z = 3.33, p < 0.01). Thus, all of the statistical tests supported perceived control as mediating the relationships between the privacy assurance approaches (ISP, REG, and LEG) and CFIP.
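The Sobel statistic divides the indirect effect a·b by its approximate standard error; a minimal sketch (the coefficients and standard errors below are hypothetical, since the paper reports only the resulting z values):

```python
import math

def sobel_z(a, se_a, b, se_b):
    """a: path from IV to mediator; b: path from mediator to DV.
    z = a*b / sqrt(b^2 * se_a^2 + a^2 * se_b^2)"""
    return (a * b) / math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)

# Hypothetical estimates: IV -> mediator path (a) and mediator -> DV path (b).
z = sobel_z(a=0.21, se_a=0.07, b=-0.60, se_b=0.06)
significant = abs(z) > 1.96  # two-tailed test at the 0.05 level
```

A |z| above 1.96 indicates a significant indirect effect, which is the criterion the three reported z values (2.96, 4.34, 3.33) satisfy.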

6. Discussions and Implications
6.1. Discussions of Findings
This study seeks to clarify the nature of control in the context of information privacy to generate insights into the effects of different privacy assurance approaches on context-specific concerns for information privacy. We theorized that such effects are exhibited through mediation by perceived control over personal information and developed arguments in support of interactive effects involving different privacy assurance approaches. In general, the results support our core assertion that perceived control over personal information is a key factor affecting context-specific concerns for information privacy. We extend the literature by investigating the interaction among privacy assurance approaches that provide individuals with personal control or proxy control.

By exploring the interaction effects of three privacy assurance approaches, this study takes the privacy discourse beyond a general discussion of privacy concerns. It generates insights into how the context-specific privacy concerns of individuals can be alleviated by raising their perceived control over the collection and use of their personal information in a specific context. Given the debate among scholars and practitioners on the relative effectiveness of these three (and other) approaches for reducing privacy concerns (Culnan and Bies 2003), these insights can serve as a theoretical basis for further work on this topic. Beyond confirming prior findings that these three privacy assurance approaches have a direct effect on enhancing control perceptions (thereby alleviating privacy concerns) in a specific context, this study focuses on the interactions among these approaches. Specifically, two interactions were proposed and one was supported (involving individual self-protection and government legislation). Results show that the presence of individual self-protection offering personal control diminishes the impact of government legislation offering proxy control, but it does not diminish the impact of proxy control via industry self-regulation (as hypothesized).

One plausible explanation for this insignificant interaction is that individuals may perceive that industry self-regulators generally lack enforcement authority. Hence, individuals may not regard industry self-regulators (such as TRUSTe) as powerful others that can exercise proxy control for them. Prior

studies have demonstrated weak effects of industry self-regulation in addressing consumer privacy concerns (Hui et al. 2007, Moores 2005, Moores and Dhillon 2003). In fact, Edelman (2011) recently pointed out that some industry self-regulators, such as privacy certification authorities, have failed to pursue complaints against major companies whose privacy breaches were found to be "inadvertent." Therefore, in the event that individuals' personal information has been misused, they may not have an effective means to seek redress, because there is reasonable basis to question the enforcement power of industry self-regulators to ensure that merchants act according to their privacy policies.

To obtain further insight into the interaction that was contrary to prediction, we conducted a post hoc analysis using ANOVA. The ANOVA results (see Appendix H) confirmed the interaction pattern demonstrated in PLS: there was a significant interaction between individual self-protection and government legislation (F = 4.34, p < 0.05), but the interaction between individual self-protection and industry self-regulation was not significant (F = 0.60, p = 0.44). However, the ANOVA results also revealed an unexpected significant interaction between industry self-regulation and government legislation (F = 4.73, p < 0.05). After plotting the significant interaction effects with t-tests, we found substitution effects in the two significant interactions: one between individual self-protection and government legislation (ISP × LEG) and the other between industry self-regulation and government legislation (REG × LEG). As shown in Appendix H, the first interaction effect shows that the difference between the two ISP conditions in the no-LEG condition is significant, whereas this difference is not significant in the LEG condition. Similarly, the difference between the two LEG conditions is significant in the no-ISP condition, whereas this difference is not significant in the ISP condition. These results suggest that for the two types of control assurance, i.e., personal control via ISP and proxy control via LEG, one could substitute for the other to some extent.

Along with the aforementioned interaction effect between individual self-protection and government legislation (ISP × LEG), the relationship between industry self-regulation and government legislation (REG × LEG) shares a similar interaction pattern. The results show that the difference between the two REG conditions in the no-LEG condition is significant, whereas this difference is not significant in the LEG condition. Similarly, the difference between the two LEG conditions is significant in the no-REG condition, whereas this difference is not significant in the REG condition. These results suggest that for the two regulatory approaches (i.e., industry self-regulation and government legislation), one could substitute for the other to some extent.


Looking at the entire set of significant interactions, it seems that individuals understand individual self-protection and industry self-regulation in a similar fashion (even though the intent of the latter privacy assurance mechanism may have been to offer proxy control). If this is the case, it would explain the results showing that industry self-regulation interacts with government legislation but not with individual self-protection. One possible explanation for this result is the weak enforcement power of industry self-regulators. Individuals may perceive that they do not have an effective means to seek redress from industry self-regulators when their personal information has been misused. Thus, as in the case of individual self-protection, individuals have to exercise careful discretion about what personal information to provide when industry self-regulation is present. It seems that the use of individual self-protection or industry self-regulation shifts much of the onus of protecting personal information to individuals themselves. This may explain why individuals did not regard industry self-regulators as powerful others that can exercise proxy control for them.

Because the correlation matrix (Table 2) showed substantially different correlation values between perceived control and the first-order factors of CFIP (collection, unauthorized access, error, and secondary use), we conducted a post hoc analysis to see how perceived control affects each first-order factor of CFIP (see Appendix I). Results showed that perceived control played a significant role in alleviating privacy concerns related to all four aspects. Among the relationships between perceived control and the four first-order factors of CFIP, perceived control was more influential in addressing concerns about secondary use (b = −0.61) and unauthorized access (b = −0.57) than concerns about collection (b = −0.41) and error (b = −0.39). These variations across the four first-order factors suggest that there is value in investigating the four first-order factors of CFIP at a more fine-grained level in future research. From a practical perspective, LBS practitioners should pay more attention to developing and deploying privacy control mechanisms that can limit secondary use of, and unauthorized access to, personal information.

Examining the control variables in the structural models also offers some insights. Among the three types of personal characteristics (demographic differences, personality traits, and previous privacy experience), only previous privacy experience was found to have a significant effect on privacy concerns. This implies that those who have encountered negative privacy experiences are, based on that experience, more aware of the undesirable consequences of disclosing personal information in the LBS context. Interestingly, most of the

items in our list of personal characteristics (demo-graphic differences and personality traits such asdesire for information control and trust propensity)were shown to have insignificant influences on pri-vacy concerns in the specific context of LBS. Althoughthese variables were found to be significant whenincluded as the predictors of general privacy con-cerns in prior research, their effects were shown tobe overridden by context-specific factors that tie theassessment of privacy concerns to specific servicesin this research. This is consistent with the proposi-tion of privacy paradox (Acquisti and Grossklags 2005)that individuals’ stated levels of general privacy con-cerns often deviate from or are even contradictoryto their actual privacy assessment in a specific con-text. As Berendt et al. (2005) demonstrated throughan experimental study, individuals could easily for-get their stated general privacy concerns and dis-close very personal details when interacting with anentertaining website. Accordingly, we suggest thatfuture privacy research should not only examine pri-vacy concerns at a general level but also considersituational elements and contextual differences at aspecific level.

6.2. Limitations and Future Research

There are several limitations in this study that present useful opportunities for further research. First, we conducted the study in Singapore, which has a strong reputation for enforcing government legislation (Harding 2001). Thus, the subjects may have well-formed and powerful beliefs about proxy control through government legislation. In countries where such government legislation is generally lacking or where enforcement of government legislation is limited, individuals may have a weaker preference for proxy control through government legislation. As Smith (2004) pointed out, different countries have approached privacy issues differently in various regulatory structures (with "omnibus" privacy bills or a "patchwork" of sector-specific laws). Therefore, future research should be conducted in other countries to provide further insights into the effects of privacy assurance approaches.

Second, we employed a manipulation of privacy-enhancing technology as one type of individual self-protection approach that is commonly used in the practice of LBS. Future research can investigate whether nontechnological self-protection approaches (e.g., reading privacy policies, refusal, misrepresentation, removal, negative word-of-mouth, complaining directly to online companies, and complaining indirectly to third-party organizations; Son and Kim 2008) would yield the same impact as privacy-enhancing technology in terms of raising perceived control over personal information.


Xu et al.: Privacy Assurances, Perceived Control, and Privacy Concerns. Information Systems Research 23(4), pp. 1342–1363, © 2012 INFORMS

Third, we have included a number of individual differences as control variables in this study. Future research can examine whether such differences moderate the relationship between the three privacy assurance approaches and perceived control over personal information. In addition, the role of personal disposition has been an important factor in behavioral models, such as those related to trust and the effect of personal disposition to trust on trusting behavior (e.g., McKnight et al. 2002). In light of positioning privacy concerns as context-specific beliefs, it might be worthwhile to examine how disposition to value privacy (Xu et al. 2008) would impact context-specific privacy concerns. Fourth, and by design, this research is limited to the examination of the mediating role of perceived control between privacy assurance approaches and context-specific privacy concerns, as well as the interaction effects involving different privacy assurance approaches. Therefore, we have not extended the nomological network to consider how those context-specific privacy concerns are translated into privacy-related intentions and behaviors. An extension of this study can view context-specific privacy concerns as a mediating variable in a larger nomological network, with it leading to privacy-related behaviors.
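The mediation logic discussed above (privacy assurance → perceived control → privacy concerns) is often assessed with the Sobel (1982) test on the indirect effect. A minimal sketch follows; the path estimates and standard errors plugged in are invented for illustration, not values from this study.

```python
import math

def sobel_z(a: float, s_a: float, b: float, s_b: float) -> float:
    """z statistic for the indirect effect a*b (Sobel 1982).

    a, s_a: path (and SE) from the assurance mechanism to the mediator
            (perceived control).
    b, s_b: path (and SE) from the mediator to the outcome
            (privacy concerns).
    """
    return (a * b) / math.sqrt(b**2 * s_a**2 + a**2 * s_b**2)

# Illustrative values: assurance raises perceived control (a > 0),
# which in turn lowers privacy concerns (b < 0).
z = sobel_z(a=0.40, s_a=0.08, b=-0.50, s_b=0.07)
significant = abs(z) > 1.96  # two-tailed test at the 0.05 level
```

With these illustrative inputs, the indirect effect is negative and the |z| statistic exceeds the 1.96 cutoff, i.e., the mediated path would be judged significant.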

In addition, the respondents for this study were recruited from major Web portals in Singapore. Such a sampling approach does not allow us to report the number of individuals who saw the request for participation, unlike the traditional survey sampling approach, where a response rate is calculated and reported to compare the demographics of the sample against those of the population. Care must be taken in any effort to generalize our findings beyond the boundary of our sample. Lastly, the research model has been developed and tested in the specific context of LBS. An implication for future research is to extend the theoretical framework described in this research to other contexts that may raise similar or extended privacy issues. For example, there has been much discussion in the public media about privacy-related backlashes at social networking sites (e.g., Brandimarte et al. 2010, Wang et al. 2011). It would be interesting to test the theoretical framework developed here in the context of online social interactions to assess its applicability.

6.3. Theoretical Contributions

This study aims to contribute to existing privacy research in several ways. First, our primary contribution is to clarify the link between control and privacy in the context of LBS. Although the notion of control is embedded in many privacy definitions and has been used to operationalize privacy in instruments, its meaning has been ambiguous. In this research, we establish the mediating role of perceived control in mitigating privacy concerns in the context of LBS. Such separation of control from privacy concerns is important because it enables us to avoid conflating the concept of privacy with the concept of control.

Second, our proposed model has established the importance of a psychological perspective of control in alleviating privacy concerns. Specifically, we integrated control agency theory into the research model and examined the efficacy of three privacy assurance approaches (individual self-protection, industry self-regulation, and government regulation) in influencing privacy concerns through the mediating effect of perceived control in the context of LBS. This represents one of the few studies that theoretically differentiate the three privacy assurance approaches by linking them with different types of control agencies. Most importantly, although the privacy literature has exclusively examined the individual effects of privacy assurance approaches (e.g., Metzger 2006), this study extends the literature by proposing interaction effects of these mechanisms on alleviating privacy concerns, based on the type of control agency each can provide. This extension is particularly relevant in today's world, with its diversity of privacy assurance approaches. In particular, this study derives theory-driven privacy interventions and provides early empirical evidence showing that implementing more privacy assurance approaches does not necessarily lead to higher control perceptions.
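The substitution pattern implied by these interaction effects (two assurance mechanisms together adding less than the sum of their separate effects) can be illustrated with a toy moderated regression that includes an explicit product term. The coefficients and data-generating process below are invented for illustration, not estimates from this study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500  # hypothetical sample size

# Two binary assurance manipulations (present = 1, absent = 0).
self_reg = rng.integers(0, 2, size=n)
legislation = rng.integers(0, 2, size=n)

# Substitution pattern: each mechanism raises perceived control, but the
# negative interaction means their joint effect is sub-additive.
perceived_control = (
    0.5 * self_reg
    + 0.5 * legislation
    - 0.3 * self_reg * legislation
    + rng.normal(scale=0.3, size=n)
)

# OLS with intercept, both main effects, and the product term.
X = np.column_stack(
    [np.ones(n), self_reg, legislation, self_reg * legislation]
)
coef, *_ = np.linalg.lstsq(X, perceived_control, rcond=None)
# coef = [intercept, b_self_reg, b_legislation, b_interaction]
```

A negative estimated interaction coefficient alongside positive main effects is the regression signature of the "one mechanism is adequate" finding discussed above.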

Third, in response to the compelling call for research investigating the role of technologies in the theoretical development of privacy (Waldo et al. 2007), we viewed technological advancement as a double-edged sword in this research: on the one hand, location-based technologies make the privacy challenge particularly vexing; on the other hand, privacy protections can be implemented through privacy-enhancing technologies (PETs). Although the negative side of technological advancement has long been highlighted as the main driver of privacy concerns in various contexts (e.g., Internet, data mining, and profiling), the positive side, regarding the role of PETs, has not been fully addressed in the IS field. With the active rollout of PETs by technologists (e.g., Squicciarini et al. 2011), it is important for IS researchers to include PETs in the examination of individual self-protection approaches in privacy research.

6.4. Practical Implications

The findings of this study have useful implications for LBS providers and individuals as well as regulatory bodies and LBS technology developers. To the extent that perceived control is a key factor influencing privacy concerns pertaining to LBS, measures that can increase users' perceived control over the collection

and use of their personal information should help alleviate individuals' privacy concerns pertaining to LBS.

Although the presence of all three privacy assurance approaches has an overall effect, this study suggests that, for the two regulatory approaches of privacy assurance (i.e., industry self-regulation and government legislation), implementing one of them seems adequate to instill consumers' confidence in controlling their personal information in LBS. This finding sheds some light on the conflict between the different approaches of the United States and the European Union to regulating privacy: the self-regulatory approach versus the comprehensive legislative approach. In the United States, self-regulation, particularly in the context of e-commerce and online social networks, is used as a means to preempt the need for legislation, which can be more constraining to industry participants. On the opposite side of the spectrum, the European Union takes a stronger approach that relies on government enforcement of mandatory legal rules to ensure adequate privacy protection of personal information. Our finding adds to the self-regulation versus legislation debate by suggesting that having a high degree of self-regulation nullifies the need for legislation, whereas having a high degree of legislation nullifies the need for self-regulation. In addition, this study suggests that for the legislative approach of privacy assurance and the individual self-protection approach, one could substitute for the other to some extent. Because government legislation is usually more expensive to institute and only exists within a legal jurisdiction (Pavlou and Gefen 2004), promoting the individual self-protection approach should increasingly be perceived as a viable substitute for the government legislation approach because of its ability to cross international, regulatory, and business boundaries.

Results of this study lead to the recommendation that in situations where regulatory bodies are unwilling to enact legislation to protect the privacy of personal information, or cannot enforce the legislation when violations occur, the effects of individual self-protection and industry self-regulation are likely to be pronounced. Therefore, LBS technology developers can benefit from the promotion of individual self-protection approaches (especially through PETs), and LBS providers can increase their business by promoting industry self-regulation with a trusted third party. Without government legislation in place, it is valuable to provide individuals with individual self-protection and industry self-regulation. For example, as a startup company in the LBS industry, Loopt.com12 has promoted its services

12 Altman, S. "Best Practices for Location-based Services: Privacy, User Control, Carrier Relations, Advertising, and More," Where 2.0 Conference, Burlingame, CA, May 12–14, 2008. Available at http://en.oreilly.com/where2008/public/schedule/detail/1700.

by emphasizing its privacy-enhancing features, its TRUSTe membership, and its relationships with industry regulators that establish accepted standards of privacy protection (e.g., the Center for Democracy and Technology and the Ponemon Institute). Considering that there has yet to be any international consensus on suitable legal frameworks for privacy practices, it is unrealistic to expect privacy laws to be enforced globally. Therefore, a hybrid mechanism combining individual self-protection and industry self-regulation can effectively provide individuals with control over the collection and use of their personal information. Such a hybrid mechanism may become increasingly prevalent globally.

7. Conclusion

In the years ahead, ubiquitous computing may be reinforced by rapid advances in location-based technology that will continue to produce new mobile services for individuals. This study is one of the first attempts to develop a theory of privacy by adopting a psychological control perspective in the specific context of LBS. Our results reveal that individual self-protection, industry self-regulation, and government legislation interact to affect perceived control, which in turn influences the context-specific privacy concerns of individuals. With an understanding of the rationale for the interactions among the three privacy assurance approaches, these results serve as a basis for future theoretical development in the area of information privacy. Theoretical developments in this direction should also yield valuable insights that can guide practice.

Electronic Companion

An electronic companion to this paper is available as part of the online version at http://dx.doi.org/10.1287/isre.1120.0416.

Acknowledgments

The authors are very grateful to the senior editor, Elena Karahanna, for her encouragement and direction in helping them develop this manuscript. They are also grateful to the associate editor, Mariam Zahedi, for her very helpful guidance and to the three anonymous reviewers for their constructive advice and helpful comments on earlier versions of this manuscript. Heng Xu gratefully acknowledges the financial support of the National Science Foundation under Grant CNS-0953749.

References

ABI (2011) Location analytics and privacy: The impact of privacy on LBS, location analytics, and advertising. Allied Business Intelligence Inc. http://www.abiresearch.com/research/1007751.

Ackerman MS, Mainwaring SD (2005) Privacy issues and human-computer interaction. Garfinkel S, Cranor L, eds. Security and Usability: Designing Secure Systems That People Can Use (O'Reilly, Sebastopol, CA), 381–400.

Acquisti A, Grossklags J (2005) Privacy and rationality in individual decision making. IEEE Security Privacy 3(1):26–33.

Agarwal R, Karahanna E (2000) Time flies when you're having fun: Cognitive absorption and beliefs about information technology usage. MIS Quart. 24(4):665–692.

Ajzen I (2001) Nature and operation of attitudes. Annual Rev. Psych. 52(February):27–58.

Altman I (1976) Privacy: A conceptual analysis. Environment Behav. 8(1):7–29.

Angst CM, Agarwal R (2009) Adoption of electronic health records in the presence of privacy concerns: The elaboration likelihood model and individual persuasion. MIS Quart. 33(2):339–370.

Armstrong JS, Overton TS (1977) Estimating nonresponse bias in mail surveys. J. Marketing Res. 14(3):396–402.

Awad NF, Krishnan MS (2006) The personalization privacy paradox: An empirical evaluation of information transparency and the willingness to be profiled online for personalization. MIS Quart. 30(1):13–28.

Bandura A (2001) Social cognitive theory: An agentic perspective. Annual Rev. Psych. 52(1):1–26.

Bansal G, Zahedi F, Gefen D (2008) The moderating influence of privacy concern on the efficacy of privacy assurance mechanisms for building trust: A multiple-context investigation. Proc. 29th Annual Internat. Conf. Inform. Systems (ICIS 2008), Paris, France (AIS, Atlanta).

Bansal G, Zahedi FM, Gefen D (2010) The impact of personal dispositions on information sensitivity, privacy concern and trust in disclosing health information online. Decision Support Systems 49(2):138–150.

Barkhuus L, Dey A (2003) Location-based services for mobile telephony: A study of user's privacy concerns. Proc. INTERACT, 9th IFIP TC13 Internat. Conf. Human-Comput. Interaction, Zurich, Switzerland (IOS Press, Amsterdam), 709–712.

Barkhuus L, Brown B, Bell M, Sherwood S, Hall M, Chalmers M (2008) From awareness to repartee: Sharing location within social groups. Proc. Twenty-Sixth Annual SIGCHI Conf. Human Factors Comput. Systems (ACM, Florence, Italy), 497–506.

Baron RM, Kenny DA (1986) The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. J. Personality Soc. Psych. 51(6):1173–1182.

Bellman S, Johnson EJ, Kobrin SJ, Lohse GL (2004) International differences in information privacy concerns: A global survey of consumers. Inform. Soc. 20(5, Nov–Dec):313–324.

Berendt B, Gunther O, Spiekermann S (2005) Privacy in e-commerce: Stated preferences vs. actual behavior. Comm. ACM 48(4):101–106.

Berghel H (1997) Cyberspace 2000: Dealing with information overload. Comm. ACM 40(2):19–24.

Brandimarte L, Acquisti A, Loewenstein G (2010) Misplaced confidences: Privacy and the control paradox. Proc. Workshop Econom. Inform. Security (WEIS), Boston, June 2010, http://weis2010.econinfosec.org/papers/session2/weis2010_brandimarte.pdf.

Buchanan T, Paine C, Joinson AN, Reips U (2007) Development of measures of online privacy concern and protection for use on the Internet. J. Amer. Soc. Inform. Sci. Tech. 58(2):157–165.

Burkert H (1997) Privacy-enhancing technologies: Typology, critique, vision. Agre P, Rotenberg M, eds. Technology and Privacy: The New Landscape (MIT Press, Cambridge, MA), 126–143.

Campbell DT, Fiske DW (1959) Convergent and discriminant validation by the multitrait-multimethod matrix. Psych. Bulletin 56(1):81–105.

Carte AT, Russell JC (2003) In pursuit of moderation: Nine common errors and their solutions. MIS Quart. 27(3):479–501.

Caudill EM, Murphy PE (2000) Consumer online privacy: Legal and ethical issues. J. Public Policy and Marketing 19(1):7–19.

Chin WW (1998) The partial least squares approach to structural equation modeling. Marcoulides GA, ed. Modern Methods for Business Research (Lawrence Erlbaum Associates, Mahwah, NJ), 295–336.

Chin WW, Marcolin BL, Newsted PR (2003) A partial least squares latent variable modeling approach for measuring interaction effects: Results from a Monte Carlo simulation study and an electronic-mail emotion/adoption study. Inform. Systems Res. 14(2):189–217.

Cohen J (1988) Statistical Power Analysis for the Behavioral Sciences, 2nd ed. (L. Erlbaum Associates, Hillsdale, NJ).

Cook M, Campbell DT (1979) Quasi-Experimentation: Design and Analysis Issues for Field Settings (Houghton Mifflin, Boston).

Cranor LF (2002) Web Privacy with P3P (O'Reilly & Associates, Sebastopol, CA).

CTIA (2008) Best practices and guidelines for location based services. The Cellular Telecommunications and Internet Association (CTIA). http://www.ctia.org/content/index.cfm/AID/11300.

Culnan MJ (1993) "How did they get my name?" An exploratory investigation of consumer attitudes toward secondary information use. MIS Quart. 17(3):341–364.

Culnan MJ (1995) Consumer awareness of name removal procedures: Implication for direct marketing. J. Interactive Marketing 9(2):10–19.

Culnan MJ (2000) Protecting privacy online: Is self-regulation working? J. Public Policy and Marketing 19(1):20–26.

Culnan MJ, Armstrong PK (1999) Information privacy concerns, procedural fairness and impersonal trust: An empirical investigation. Organ. Sci. 10(1):104–115.

Culnan MJ, Bies JR (2003) Consumer privacy: Balancing economic and justice considerations. J. Soc. Issues 59(2):323–342.

Culnan MJ, Williams CC (2009) How ethics can enhance organizational privacy: Lessons from the ChoicePoint and TJX data breaches. MIS Quart. 33(4):673–687.

DeCew JW (1997) In Pursuit of Privacy: Law, Ethics, and the Rise of Technology (Cornell University Press, Ithaca, NY).

Dhillon GS, Moores T (2001) Internet privacy: Interpreting key issues. Inform. Resources Management J. 14(4):33–37.

Dinev T, Hart P (2004) Internet privacy concerns and their antecedents—measurement validity and a regression model. Behav. Inform. Tech. 23(6):413–423.

Dinev T, Hart P (2006) An extended privacy calculus model for e-commerce transactions. Inform. Systems Res. 17(1):61–80.

Edelman B (2011) Adverse selection in online "trust" certifications and search results. Electronic Commerce Res. Appl. 10(1):17–25.

Eppler MJ, Mengis J (2004) The concept of information overload: A review of literature from organization science, marketing, accounting, MIS, and related disciplines. Inform. Soc. 20(5):325–344.

Falk RF, Miller NB (1992) A Primer for Soft Modeling (The University of Akron Press, Akron, OH).

FTC (2000) Privacy online: Fair information practices in the electronic marketplace. Federal Trade Commission. http://www.ftc.gov/reports/privacy2000/privacy2000.pdf.

FTC (2009) Beyond voice: Mapping the mobile marketplace. Federal Trade Commission. www.ftc.gov/opa/2009/04/mobilerpt.shtm.

FTC (2010) A preliminary FTC staff report on protecting consumer privacy in an era of rapid change: A proposed framework for businesses and policymakers. Federal Trade Commission. http://www.ftc.gov/opa/2010/12/privacyreport.shtm.

Goldsmith RE, Freiden JB, Kilsheimer JC (1993) Social values and female fashion leadership: A cross-cultural study. Psych. Marketing 10(5):399–412.

Goodwin C (1991) Privacy: Recognition of a consumer right. J. Public Policy and Marketing 10(1):149–166.

Harding A (2001) Comparative law and legal transplantation in South East Asia. Nelken D, Feest J, eds. Adapting Legal Cultures (Hart Publishing, Oxford, Portland, OR), 199–222.

Hoadley MC, Xu H, Lee J, Rosson MB (2010) Privacy as information access and illusory control: The case of the Facebook news feed privacy outcry. Electronic Commerce Res. Appl. 9(1):50–60.

Hoffman DL, Novak TP, Peralta MA (1999) Information privacy in the marketspace: Implications for the commercial uses of anonymity on the Web. Inform. Soc. 15(2):129–139.

Hui KL, Teo HH, Lee SYT (2007) The value of privacy assurance: An exploratory field experiment. MIS Quart. 31(1):19–33.

Im G, Rai A (2008) Knowledge sharing ambidexterity in long-term interorganizational relationships. Management Sci. 54(7):1281–1296.

Johnson CA (1974) Privacy as personal control. Carson DH, ed. Man-Environment Interactions: Evaluations and Applications: Part 2 (Environmental Design Research Association, Washington, DC), 83–100.

Junglas IA, Watson RT (2006) The U-constructs: Four information drives. Comm. AIS 17(1):569–592.

Junglas IA, Johnson NA, Spitzmüller C (2008) Personality traits and concern for privacy: An empirical study in the context of location-based services. Eur. J. Inform. Systems 17(4):387–402.

Langer EJ (1975) The illusion of control. J. Personality Soc. Psych. 32(2):311–328.

Laufer RS, Wolfe M (1977) Privacy as a concept and a social issue—multidimensional developmental theory. J. Soc. Issues 33(3):22–42.

Li H, Sarathy R, Xu H (2011) The role of affect and cognition on online consumers' willingness to disclose personal information. Decision Support Systems 51(3):434–445.

Lindell MK, Whitney DJ (2001) Accounting for common method variance in cross-sectional research designs. J. Appl. Psych. 86(1):114–121.

Malhotra KN, Kim SS, Agarwal J (2004) Internet users' information privacy concerns (IUIPC): The construct, the scale, and a causal model. Inform. Systems Res. 15(4):336–355.

Malhotra KN, Kim SS, Patil A (2006) Common method variance in IS research: A comparison of alternative approaches and a reanalysis of past research. Management Sci. 52(12):1865–1883.

Margulis TS (2003a) Privacy as a social issue and behavioral concept. J. Soc. Issues 59(2):243–261.

Margulis TS (2003b) On the status and contribution of Westin's and Altman's theories of privacy. J. Soc. Issues 59(2):411–429.

McKnight DH, Choudhury V, Kacmar C (2002) Developing and validating trust measures for e-commerce: An integrative typology. Inform. Systems Res. 13(3):334–359.

Metzger MJ (2006) Effects of site, vendor, and consumer characteristics on web site trust and disclosure. Comm. Res. 33(3):155–179.

Miller SM (1980) Why having control reduces stress: If I can stop the roller coaster I don't want to get off. Garber J, Seligman MEP, eds. Human Helplessness: Theory and Applications (Academic Press, New York), 71–95.

Milne GR, Culnan MJ (2004) Strategies for reducing online privacy risks: Why consumers read (or don't read) online privacy notices. J. Interactive Marketing 18(3):15–29.

Milne GR, Rohm A (2000) Consumer privacy and name removal across direct marketing channels: Exploring opt-in and opt-out alternatives. J. Public Policy and Marketing 19(2):238–249.

Mowday RT, Sutton RI (1993) Organizational behavior: Linking individuals and groups to organizational contexts. Annual Rev. Psych. 44(1):195–229.

Moores T (2005) Do consumers understand the role of privacy seals in e-commerce? Comm. ACM 48(3):86–91.

Moores T, Dhillon G (2003) Do privacy seals in e-commerce really work? Comm. ACM 46(12):265–271.

Nicolaou AI, McKnight DH (2006) Perceived information quality in data exchanges: Effects on risk, trust, and intention to use. Inform. Systems Res. 17(4):332–351.

Nunnally JC (1978) Psychometric Theory, 2nd ed. (McGraw-Hill, New York).

Nov O, Wattal S (2009) Social computing privacy concerns: Antecedents and effects. Proc. 27th Internat. Conf. Human Factors Comput. Systems (CHI) (ACM, Boston), 333–336.

Pavlou PA, Gefen D (2004) Building effective online marketplaces with institution-based trust. Inform. Systems Res. 15(1):37–59.

Pavlou PA, Liang H, Xue Y (2007) Understanding and mitigating uncertainty in online exchange relationships: A principal-agent perspective. MIS Quart. 31(1):105–136.

Phelps J, Nowak G, Ferrell E (2000) Privacy concerns and consumer willingness to provide personal information. J. Public Policy and Marketing 19(1):27–41.

Podsakoff MP, MacKenzie BS, Lee JY, Podsakoff NP (2003) Common method biases in behavioral research: A critical review of the literature and recommended remedies. J. Appl. Psych. 88(5):879–890.

Reed GM, Taylor SE, Kemeny ME (1993) Perceived control and psychological adjustment in gay men with AIDS. J. Appl. Soc. Psych. 23(10):791–824.

Schoeman FD (1984) Philosophical Dimensions of Privacy: An Anthology (Cambridge University Press, Cambridge, UK).

Schwartz MP (1999) Privacy and democracy in cyberspace. Vanderbilt Law Rev. 52(6):1610–1701.

Sheehan KB (1999) An investigation of gender differences in on-line privacy concerns and resultant behaviors. J. Interactive Marketing 13(4):24–38.

Sheehan KB (2002) Toward a typology of Internet users and online privacy concerns. Inform. Soc. 18(1):21–32.

Skinner EA (1996) A guide to constructs of control. J. Personality Soc. Psych. 71(3):549–570.

Slyke CV, Shim JT, Johnson R, Jiang JJ (2006) Concern for information privacy and online consumer purchasing. J. Assoc. Inform. Systems 7(6):415–444.

Smith HJ (2004) Information privacy and its management. MIS Quart. Executive 3(4):201–213.

Smith HJ, Dinev T, Xu H (2011) Information privacy research: An interdisciplinary review. MIS Quart. 35(4):989–1015.

Smith HJ, Milberg JS, Burke JS (1996) Information privacy: Measuring individuals' concerns about organizational practices. MIS Quart. 20(2, June):167–196.

Sobel ME (1982) Asymptotic intervals for indirect effects in structural equations models. Leinhart S, ed. Sociological Methodology (Jossey-Bass, San Francisco), 290–312.

Solove DJ (2002) Conceptualizing privacy. California Law Rev. 90(4):1087–1155.

Solove DJ (2006) A taxonomy of privacy. Univ. Pennsylvania Law Rev. 154(3):477–560.

Solove DJ (2007) "I've got nothing to hide" and other misunderstandings of privacy. San Diego Law Rev. 44(4):745–772.

Son JY, Kim SS (2008) Internet users' information privacy-protective responses: A taxonomy and a nomological model. MIS Quart. 32(3):503–529.

Spiekermann S, Cranor LF (2009) Engineering privacy. IEEE Trans. Software Engrg. 35(1):67–82.

Spiro WG, Houghteling LJ (1981) The Dynamics of Law, 2nd ed. (Harcourt Brace Jovanovich, New York).

Squicciarini CA, Xu H, Zhang X (2011) CoPE: Enabling collaborative privacy management in online social networks. J. Amer. Soc. Inform. Sci. Tech. 62(3):521–534.

Stewart KA, Segars AH (2002) An empirical examination of the concern for information privacy instrument. Inform. Systems Res. 13(1):36–49.

Sundar SS, Marathe SS (2010) Personalization vs. customization: The importance of agency, privacy, and power usage. Human Comm. Res. 36(3):298–322.

Swire PP (1997) Markets, self-regulation, and government enforcement in the protection of personal information. Daley WM, Irving L, eds. Privacy and Self-Regulation in the Information Age (Department of Commerce, Washington, DC), 3–19.

Tang Z, Hu YJ, Smith MD (2008) Gaining trust through online privacy protection: Self-regulation, mandatory standards, or caveat emptor. J. Management Inform. Systems 24(4):153–173.

TRUSTe (2004) TRUSTe plugs into wireless: TRUSTe's wireless advisory committee announces first wireless privacy standards. http://www.truste.org/articles/wireless_guidelines_0304.php.

Tsai YJ, Kelly GP, Cranor FL, Sadeh N (2010) Location-sharing technologies: Privacy risks and controls. I/S: J. Law Policy Inform. Soc. 6(2):119–151.

Turner EC, Dasgupta S (2003) Privacy on the Web: An examination of users' concerns, technology, and implications for business organizations and individuals. Inform. Systems Management 20(1):8–18.

Waldo J, Lin H, Millett LI (2007) Engaging Privacy and Information Technology in a Digital Age (National Academies Press, Washington, DC).

Wallston AK (2001) Conceptualization and operationalization of perceived control. Baum A, Revenson TA, Singer JE, eds. Handbook of Health Psychology (Lawrence Erlbaum Associates, Mahwah, NJ), 49–58.

Wang W, Benbasat I (2009) Interactive decision aids for consumer decision making in e-commerce: The influence of perceived strategy restrictiveness. MIS Quart. 33(2):293–320.

Wang N, Xu H, Grossklags J (2011) Third-party apps on Facebook: Privacy and the illusion of control. Proc. ACM Sympos. Comput.-Human Interaction for Management Inform. Tech. (CHIMIT), Boston (ACM, New York).

Weisz JR, Rothbaum FM, Blackburn TC (1984) Standing out and standing in: The psychology of control in America and Japan. Amer. Psych. 39(9):955–969.

Westin AF (1967) Privacy and Freedom (Atheneum, New York).

Wetzels M, Odekerken-Schroder G, van Oppen C (2009) Using PLS path modeling for assessing hierarchical construct models: Guidelines and empirical illustration. MIS Quart. 33(1):177–195.

Xu H, Dinev T, Smith HJ, Hart P (2008) Examining the formation of individuals' information privacy concerns: Toward an integrative view. Proc. 29th Annual Internat. Conf. Inform. Systems (ICIS 2008), Paris, France (AIS, Atlanta).

Xu H, Dinev T, Smith HJ, Hart P (2011) Information privacy concerns: Linking individual perceptions with institutional privacy assurances. J. Assoc. Inform. Systems 12(12):798–824.

Xu H, Teo HH, Tan BCY, Agarwal R (2010) The role of push-pull technology in privacy calculus: The case of location-based services. J. Management Inform. Systems 26(3):135–174.

Yamaguchi S (2001) Culture and control orientations. Matsumoto D, ed. The Handbook of Culture and Psychology (Oxford University Press, New York), 223–243.

Zweig D, Webster J (2002) Where is the line between benign and invasive? An examination of psychological barriers to the acceptance of awareness monitoring systems. J. Organ. Behav. 23(5):605–633.
