
International Journal of Human-Computer Interaction

ISSN: 1044-7318 (Print) 1532-7590 (Online) Journal homepage: http://www.tandfonline.com/loi/hihc20

Investigating the Importance of Trust on Adopting an Autonomous Vehicle

Jong Kyu Choi & Yong Gu Ji

To cite this article: Jong Kyu Choi & Yong Gu Ji (2015) Investigating the Importance of Trust on Adopting an Autonomous Vehicle, International Journal of Human-Computer Interaction, 31:10, 692-702, DOI: 10.1080/10447318.2015.1070549

To link to this article: https://doi.org/10.1080/10447318.2015.1070549

Accepted author version posted online: 09 Jul 2015. Published online: 09 Jul 2015.



Intl. Journal of Human–Computer Interaction, 31: 692–702, 2015
Copyright © Taylor & Francis Group, LLC
ISSN: 1044-7318 print / 1532-7590 online
DOI: 10.1080/10447318.2015.1070549

Investigating the Importance of Trust on Adopting an Autonomous Vehicle

Jong Kyu Choi and Yong Gu Ji
Department of Information and Industrial Engineering, Yonsei University, Seoul, Korea

Address correspondence to Yong Gu Ji, Department of Information and Industrial Engineering, Yonsei University, 262 Seongsanno, Seodaemun-gu, Seoul, 120-749, Korea. E-mail: [email protected]

The objective of this study is to examine users' adoption of autonomous vehicles and to investigate what factors drive people to trust an autonomous vehicle. A model explaining the impact of different factors on the intention to use autonomous vehicles is developed based on the technology acceptance model and trust theory. A survey of 552 drivers was conducted, and the results were analyzed using partial least squares. The results demonstrate that perceived usefulness and trust are major determinants of intention to use autonomous vehicles. The results also show that three constructs (system transparency, technical competence, and situation management) have a positive effect on trust. The study identified that trust has a negative effect on perceived risk. Among the driving-related personality traits, locus of control had a significant effect on behavioral intention, whereas sensation seeking did not. The developed model explains the factors that influence the acceptance of autonomous vehicles, and the results provide evidence of the importance of trust in the user's acceptance of an autonomous vehicle.

1. INTRODUCTION
Recent advancements in passive and active safety technologies have contributed to a notable reduction in traffic fatalities. However, traffic accidents still caused 1.3 million fatalities and 50 million injuries in 2010 (Feypell & Scheunemann, 2012). According to the National Highway Traffic Safety Administration (NHTSA; Johnson, 2013), human error accounts for 93% of traffic accidents. To achieve further enhancements in safety, the automobile industry has long invested in developing autonomous vehicles that make decisions and minimize human intervention.

Autonomous vehicles could improve safety, efficiency, and mobility by taking the driver out of the loop and relying on the vehicle to navigate itself through traffic (Beiker, 2012). Of importance, autonomous vehicles can help elderly people keep an active lifestyle. According to West et al. (2003), elderly drivers tend to keep away from tough driving situations and drive less overall. Elderly drivers generally drive less than other drivers on the road because they have lower levels of cognitive and visual functions resulting from disability (Stutts, 1998). Consequently, autonomous vehicles can help elderly drivers overcome difficult driving situations and maintain their social relationships.

NHTSA classifies autonomous vehicles on a scale that ranges from 0 to 4. At Level 0, no automation is provided and the driver has complete control. At Level 1, single functions are autonomous. Level 2 involves automation of at least two primary control functions; that is, the vehicle is partially autonomous. At Level 3, the driver may cede full control to the autonomous vehicle for a period of time. A Level 4 vehicle is fully autonomous, performing all safety-critical driving functions for an entire trip. Previous research on trust in the user adoption of autonomous vehicles has focused on partially autonomous vehicles (Ghazizadeh, Peng, Lee, & Boyle, 2012). Because highly autonomous vehicles are not yet commercialized, it is important to examine drivers' attitudes toward them.

The importance of trust has been shown in different domains, especially in the adoption of new technologies (Gefen, Karahanna, & Straub, 2003; Gefen & Straub, 2000). Many studies on automation have also shown that trust is a major determinant of the acceptance of automation (Carter & Bélanger, 2005; Gefen et al., 2003; Lee & Moray, 1992, 1994; Lee & See, 2004; Parasuraman, Sheridan, & Wickens, 2008; Pavlou, 2003). In addition, many researchers have called for insights on the factors that build trust in order to reach a better understanding of trust (Leimeister, Ebner, & Krcmar, 2005). Although the importance of trust between humans and machines has been stated in much of the research, it has yet to be systematically studied in the autonomous vehicle domain.

The main purpose of this study is to predict users' adoption of autonomous vehicles, as well as to investigate what factors drive people to trust autonomous vehicles. To accomplish this, we propose a research model and influencing factors based on a theoretical background. The study is organized as follows: Section 2 provides the literature review for the research model, Section 3 presents our research model and hypotheses, Section 4 explains the methodology, and Section 5 gives an explanation of the results. We close with a discussion and conclusion in Section 6.

2. LITERATURE REVIEW

2.1. Technology Acceptance Model
The technology acceptance model (TAM) is an information systems theory that models how users come to accept and use a technology. TAM explains how people's beliefs and attitudes are related to their intention to perform a behavior. TAM posits that two beliefs, perceived usefulness and perceived ease of use, determine an intention to use a technology. Since the introduction of this model, numerous empirical studies have shown that TAM is a parsimonious and robust model of technology acceptance behaviors in a wide variety of information systems (Davis, Bagozzi, & Warshaw, 1989; Davis & Venkatesh, 1996; Venkatesh & Davis, 2000).

TAM posits that a user's intention to use a technology is determined by its usefulness and ease of use. In previous TAM research, intentions mediate the effects of other potential antecedents on actual usage behavior (Davis et al., 1989; Taylor & Todd, 1995; Venkatesh, Morris, Davis, & Davis, 2003). Using behavioral intention as the dependent variable instead of actual usage is particularly useful for examining the acceptance of technological systems at an early stage (Chau & Hu, 2002; Sheppard, Hartwick, & Warshaw, 1988; Wu, Shen, Lin, Greenes, & Bates, 2008). The intention to use a system is determined by the user's perceived ease of use of the system and the system's perceived usefulness. Perceived usefulness refers to "the degree to which a person believes that using a particular system would enhance his or her job performance" (Davis et al., 1989, p. 320). Perceived usefulness consistently stands out as the main driver of technology adoption (King & He, 2006; Ma & Liu, 2004; Schepers & Wetzels, 2007). Perceived ease of use is defined as "the degree to which a person believes that using a particular system would be free of effort" (Davis et al., 1989, p. 320). Several studies have applied TAM to explain the adoption and utilization of driving assistance systems. Perceived usefulness and perceived ease of use have also been proposed as major determinants of the acceptance of autonomous vehicles.

2.2. Trust
Trust is particularly important for understanding human–automation partnerships. Several researchers have argued that just as trust mediates relationships between humans, it may also mediate the relationship between humans and automation (Sheridan, 1975; Sheridan & Hennessy, 1984). Trust is known to be a major determinant of reliance on and acceptance of automation, standing between people's beliefs toward automation and their intention to use it (Carter & Bélanger, 2005; Gefen, Karahanna, & Straub, 2003; Lee & Moray, 1992, 1994; Lee & See, 2004; Parasuraman et al., 2008; Pavlou, 2003). Ghazizadeh et al. (2012) advocated the consideration of a trust factor to explain the individual acceptance of driving assistance systems. Autonomous vehicles are intelligent vehicles in which steering, deceleration, and acceleration are completely controlled by the built-in automated system. Thus, we investigated various factors that affect trust through the existing literature on automation.

In the behavioral literature, most researchers adopt Mayer, Davis, and Schoorman's (1995) three dimensions—ability, benevolence, and integrity—to assess trust. Thatcher, McKnight, Baker, Arsal, and Roberts (2011) argued that trust has three dimensions, each of which corresponds to an interpersonal trusting belief. According to this definition, functionality refers to the belief that the system has the capability, functions, or features to perform essential functions. Helpfulness refers to the belief that a system will provide adequate and responsive aid, whereas predictability refers to the belief that the system acts consistently and its behavior can be forecast. Functionality is similar to interpersonal trust's competence belief. The predictability construct is similar to interpersonal trust's integrity belief. Helpfulness is similar to interpersonal trust's benevolence belief. Similarly, Hasan, Krischkowsky, and Tscheligi (2012) suggested that trust can be evaluated by assessing functionality, helpfulness, and reliability. Functionality represents the user's expectation of the trustee's capability. Helpfulness represents the user's belief that the technology provides adequate, effective, and responsive assistance. Reliability refers to whether trustees are consistent, predictable, or reliable in performance. In another study, the technology's predictability, reliability, and utility were used for evaluating trust in information systems technology (Lippert, 2001). Technology predictability is an individual's expectation of the technology's consistency of performance based on past experiences and future expectations. Technology reliability is an individual's confidence that the technology will perform consistently in situations that involve some degree of dependence and risk. Technology utility is an individual's faith, perception, and assessment of the usefulness of the technology.

Lee and Moray (1992) examined the factors that influence trust in automation. They identified the general bases of trust: performance, process, and purpose. Performance refers to the current and historical operation of the automation and includes characteristics such as reliability, predictability, and ability. Performance information describes what the automation does. More specifically, performance refers to the competency or expertise as demonstrated by the automation's ability to achieve the operator's goals. Process is the degree to which the automation's algorithms are appropriate for the situation and able to achieve the operator's goals. Process information describes how the automation operates. In interpersonal relationships, this corresponds to the consistency of actions associated with adherence to a set of acceptable principles (Mayer et al., 1995). Purpose refers to the degree to which the automation is being used within the realm of the designer's intent. Purpose describes why the automation was developed. In interpersonal relationships, the perception of such a positive orientation depends on the intentions and motives of the trustee. Söllner, Hoffmann, Hoffmann, and Leimeister (2011) also used the three dimensions of Lee and Moray (1992) in research on information technology artifacts.

As just discussed, many researchers have argued that trust has three dimensions, each of which corresponds to an interpersonal trusting belief. One dimension refers to the belief that the system is predictable and understandable. Another dimension refers to the belief that the system performs tasks accurately and correctly. The third dimension refers to the belief that the system provides adequate, effective, and responsive assistance. Thus, we propose three dimensions for trust in an autonomous vehicle: system transparency, technical competence, and situation management. System transparency refers to the degree to which users can predict and understand the operation of autonomous vehicles. Technical competence refers to the degree of user perception of the performance of autonomous vehicles. Situation management refers to the user's belief that he or she can recover control in a situation whenever desired.

2.3. Perceived Risk
The literature reveals the importance of perceived risk as a key component of trust models (Berry, 1995; Mayer et al., 1995). In several studies (Mayer et al., 1995; Mitchell, 1999), perceived risk refers to the perceived uncertainty in a given situation. In consumer behavior research, it has been related to the expectation of experiencing losses in uncertain situations (Featherman & Pavlou, 2003; Peter & Ryan, 1976). Perceived risk is a major factor linked to trust, particularly with regard to the decision to use an automated device or not (Numan, 1998; Pavlou, 2003). These studies confirm the influence of trust on automation when trust is considered both as a direct determinant of behavioral intention and as an indirect influence through perceived usefulness and perceived risk. According to Pavlou (2003), trust reduces perceived risk. Perceived risk depends on the expected probability of a negative situation (Numan, 1998). When drivers trust autonomous vehicles, they assume that the vehicles will behave as expected, reducing the perceived risk of a negative situation. Other studies have used the TAM constructs in assessing user adoption of driving assistance systems, showing that perceived risk strongly affects the intention to use a system (Adell, 2010; Meschtscherjakov, Wilfinger, Scherndl, & Tscheligi, 2009).

2.4. Personality Traits
Locus of control and sensation seeking have an influence on driving behavior when drivers are using driving assistance systems (Rudin-Brown & Ian Noy, 2002; Rudin-Brown & Parker, 2004; Stanton & Marsden, 1996; Ward, Fairclough, & Humphreys, 1995). Locus of control is a personality trait that reflects the extent to which a person believes he or she is in control of external events that affect him or her (Rotter, 1966). Someone with an internal locus of control believes he or she can control events, whereas those who do not believe so have an external locus of control. Several researchers have suggested that internals perform better than externals (Krause & Stryker, 1984; Parkes, 1984; Rotter, 1966). People with an external locus of control tend to believe that humans will always cause accidents and hence that an automated driving system would be far better than human drivers. Stanton and Young (2005) stated that an external locus of control might lead an individual to assume a passive role with the automated system. Also, it might be easier for external drivers to rely on an automated driving system, considering that they rely less on their own driving skills than do internals (Rudin-Brown & Ian Noy, 2002). This might explain why externals are expected to prefer autonomous vehicles.

Sensation seeking is the tendency to seek novel, varied, complex, and intense sensations and experiences and the willingness to take risks for the sake of such experiences (Payre et al., 2014; Zuckerman, 1994). It correlates with a multitude of risky behaviors such as gambling, smoking, and risky driving, including speeding and driving while intoxicated (Jonah, 1997). In driving assistance systems research, it has been shown that high sensation seekers tend to drive, on average, faster and less carefully (Burns & Wilde, 1995), with short distances between vehicles and strong braking (Payre et al., 2014). Rudin-Brown and Parker (2004) stated that high sensation seekers may be more likely than low sensation seekers to demonstrate behavioral adaptation to driving assistance systems.

3. RESEARCH MODEL
To examine users' adoption of autonomous vehicles, we needed to extend TAM. Synthesizing prior research on TAM and research on trust in automation, we developed a research model that incorporates additional external factors. There are 10 constructs in our model: perceived usefulness, perceived ease of use, trust, perceived risk, system transparency, technical competence, situation management, locus of control, and sensation seeking as intervening variables, and behavioral intention as the dependent variable. The antecedents of trust are distributed among three second-level constructs: system transparency, technical competence, and situation management. We tested the strength of the hypothesized relationships in our model and the robustness of the model in predicting behavioral intention to use autonomous vehicles. Figure 1 illustrates the proposed research model in this study. Twelve hypotheses are proposed based on the literature review.

FIG. 1. Research model.

As previously discussed, many studies have demonstrated that perceived usefulness and perceived ease of use positively influence behavioral intention (King & He, 2006; Ma & Liu, 2004; Schepers & Wetzels, 2007), even though the significance level differed between the results. In addition, the effect of perceived ease of use on behavioral intention is mediated by perceived usefulness (King & He, 2006; Ma & Liu, 2004). It thus seems reasonable to hypothesize that there is a positive correlation between perceived usefulness and ease of use and behavioral intention to adopt autonomous vehicles. Therefore, the following hypotheses were proposed:

H1: Perceived usefulness has a positive effect on behavioral intention.

H2: Perceived ease of use has a positive effect on behavioral intention.

H3: Perceived ease of use has a positive effect on perceived usefulness.

Previous studies have confirmed that trust is a major construct for predicting the adoption of automation (Carter & Bélanger, 2005; Gefen et al., 2003; Lee & Moray, 1992, 1994; Lee & See, 2004; Parasuraman et al., 2008; Pavlou, 2003). Some researchers have confirmed the influence of trust when trust is considered both as a direct determinant of behavioral intention and as an indirect influence through perceived usefulness and perceived risk (Carter & Bélanger, 2005; Gefen et al., 2003; Pavlou, 2003). Also, several studies have found that trust negatively influences perceived risk (Doney & Cannon, 1997; Pavlou, 2003). Therefore, these three hypotheses have been proposed:

H4: Trust has a positive effect on behavioral intention.
H5: Trust has a positive effect on perceived usefulness.
H6: Trust has a negative effect on perceived risk.

Based on a review of the previous literature, three trust constructs were elicited. Further, many studies have found that a user's beliefs about automation positively influence trust (Hasan et al., 2012; Lippert, 2001; Madsen & Gregor, 2000; Muir, 1987). System operation transparency makes it possible for a user to create an accurate mental model of a system's operation and capabilities (Kieras & Bovair, 1984; Norman, 2002). Also, it is important for users to have a mental model of the system's operation, as such understanding will increase trust in autonomous vehicles. In several studies on trust in driving assistance systems, it has been shown that trust depends on the performance and reliability perceived by users (Maltz, Sun, Wu, & Mourant, 2004; Moray, Inagaki, & Itoh, 2000; Riley, 1994). In autonomous vehicles, the expectation of the system's performance will possibly increase trust. Autonomous vehicles' behaviors can lead to a sense of loss of user control. Responsive aid has been shown to affect user attitudes toward trust in automation: if the system allows drivers to take over the vehicle whenever they want to, it will be easier for them to trust autonomous vehicles. On that presumption, we hypothesize the following:

H7: System transparency has a positive effect on trust.
H8: Technical competence has a positive effect on trust.
H9: Situation management has a positive effect on trust.

Perceived risk is a major construct for predicting behavioral intention in various information technology studies. Further, studies have confirmed that perceived risk negatively influences behavioral intention (Jarvenpaa, Tractinsky, & Saarinen, 1999; Ratnasingham & Kumar, 2000). In the context of autonomous vehicle usage, it is also expected that perceived risk negatively influences behavioral intention. Therefore, we hypothesize:

H10: Perceived risk has a negative effect on behavioral intention.

According to previous literature (Rudin-Brown & Ian Noy, 2002; Stanton & Marsden, 1996; Ward et al., 1995), people with an external locus of control tend to believe they cannot control external events that affect them. Consequently, they may become overreliant on the system and are more likely to be tempted to give up supervising, as they think that they are no longer responsible for driving because the automated driving system controls vehicle operation. Thus, externals are expected to prefer autonomous vehicles more than internals. As previously mentioned, delegating control to an autonomous vehicle is expected to lower the thrill and sensory experience of driving. High sensation seekers are more likely than low sensation seekers to demonstrate behavioral adaptation to driving assistance systems (Rudin-Brown & Parker, 2004). Therefore, it is reasonable to formulate the following hypotheses:

H11: External locus of control has a positive effect on behavioral intention.

H12: Sensation seeking has a positive effect on behavioral intention.
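The hypothesized structure above can be written down compactly if one wants to script the model specification before estimating it in a PLS tool. The sketch below is illustrative only: the construct abbreviations follow the article's tables, and the data structure itself is not part of the authors' procedure.

```python
# Hypothesized structural paths (H1-H12) as (predictor, outcome, expected sign).
# Abbreviations: BI = behavioral intention, PU = perceived usefulness,
# PEOU = perceived ease of use, TRU = trust, ST = system transparency,
# TC = technical competence, SM = situation management, PR = perceived risk,
# ELOC = external locus of control, SS = sensation seeking.
HYPOTHESES = {
    "H1":  ("PU",   "BI",  "+"),
    "H2":  ("PEOU", "BI",  "+"),
    "H3":  ("PEOU", "PU",  "+"),
    "H4":  ("TRU",  "BI",  "+"),
    "H5":  ("TRU",  "PU",  "+"),
    "H6":  ("TRU",  "PR",  "-"),
    "H7":  ("ST",   "TRU", "+"),
    "H8":  ("TC",   "TRU", "+"),
    "H9":  ("SM",   "TRU", "+"),
    "H10": ("PR",   "BI",  "-"),
    "H11": ("ELOC", "BI",  "+"),
    "H12": ("SS",   "BI",  "+"),
}

# Example: list all hypothesized antecedents of behavioral intention.
print([pred for pred, outcome, _ in HYPOTHESES.values() if outcome == "BI"])
# ['PU', 'PEOU', 'TRU', 'PR', 'ELOC', 'SS']
```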

4. METHOD

4.1. Measurement Development
The 10 constructs measured in our study were perceived usefulness, perceived ease of use, trust, perceived risk, system transparency, technical competence, situation management, locus of control, sensation seeking, and behavioral intention. Each construct was measured with multiple items, and all items were adapted from the extant literature to improve content validity (Table 1). To measure perceived usefulness, perceived ease of use, and behavioral intention, we used scale items from Davis (1989). Items used to measure trust were adapted from Gefen et al. (2003) and Pavlou (2003). System transparency, technical competence, and situation management were assessed using measures adapted from previous studies on automation (Hasan et al., 2012; Lippert, 2001; Madsen & Gregor, 2000; Thatcher et al., 2011). Items for perceived risk were adapted from Pavlou (2002) and Ratnasingham and Kumar (2000). The scale of locus of control was measured by items obtained from Rudin-Brown and Ian Noy (2002). Further, the scales of sensation seeking were measured by items derived from Payre et al. (2014). All items were measured on a 7-point Likert scale from 1 (strongly disagree) to 7 (strongly agree).

4.2. Participants and Data Collection
The data were collected from online questionnaires filled out by drivers. In the questionnaire, participants provided their demographic information and responded to 30 items on the 10 constructs. Data from 635 respondents were collected. We eliminated the respondents who had missing data in any of the survey's items. As a result, 83 respondents were excluded, and 552 respondents were retained. Among them, 69.9% were male and 30.1% were female. Approximately 31.9% of respondents were younger than 30 years of age, 56.2% were 30 to 39, 7.6% were 40 to 49, 2.5% were 50 to 59, and 1.8% were older than 59. With respect to driving experience, 5.1%, 17.4%, and 77.5% had driven less than 2,000 km, 2,000 km to 20,000 km, and more than 20,000 km, respectively.
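As a minimal sketch of this screening step, the snippet below drops respondents with missing answers, assuming the raw responses sit in a CSV file with one column per item; the file name and the gender column are hypothetical, not part of the study's materials.

```python
import pandas as pd

# Hypothetical raw questionnaire export: one row per respondent,
# one column per demographic field and survey item.
responses = pd.read_csv("survey_responses.csv")

# Exclude respondents with missing data in any item, as described above
# (635 collected, 83 excluded, 552 retained in the study).
complete = responses.dropna()
print(f"{len(responses)} collected, {len(responses) - len(complete)} excluded, "
      f"{len(complete)} retained")

# Hypothetical demographic breakdown (column name 'gender' is assumed).
print(complete["gender"].value_counts(normalize=True))
```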

4.3. Data Analysis
The proposed model and hypotheses were tested using partial least squares (PLS) analysis with SmartPLS 3.0. PLS is a suitable tool for verifying the validity of constructs and evaluating the structural relationships among constructs in latent variable analysis (Chin, 1998; Gefen et al., 2000). It is appropriate to use PLS for this analysis for the following reasons (Bacon, 1999; Chin, Marcolin, & Newsted, 2003; Fornell & Bookstein, 1982). First, the PLS method provides some advantages for validating theories at an early stage. Second, PLS does not require the data to follow a strict normal distribution. Third, measurement model and structural model assessment allow factor analysis and hypothesis testing to be combined in one operation (Gefen et al., 2000).

The results were analyzed in two stages. In the first stage, we assessed the reliability and validity of the measurement model. Then, we focused on hypothesis testing and analysis. Path significance was estimated using a bootstrapping resampling method with 500 subsamples.
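To illustrate the bootstrapping step (this is not the authors' SmartPLS procedure), the sketch below resamples respondents 500 times and derives a t value for one path; a standardized bivariate coefficient is used as a simplified stand-in for a PLS structural path, and all variable names and data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in scores for one predictor (e.g., trust) and one
# outcome (e.g., behavioral intention), one value per respondent.
n = 552
trust = rng.normal(size=n)
intention = 0.4 * trust + rng.normal(scale=0.8, size=n)

def path_coefficient(x, y):
    """Standardized bivariate coefficient (simplified stand-in for a PLS path)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.mean(x * y))

estimate = path_coefficient(trust, intention)

# Bootstrap: resample respondents with replacement 500 times.
boot = np.empty(500)
for b in range(500):
    idx = rng.integers(0, n, size=n)
    boot[b] = path_coefficient(trust[idx], intention[idx])

se = boot.std(ddof=1)
print(f"path = {estimate:.3f}, bootstrap SE = {se:.3f}, t = {estimate / se:.2f}")
```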

5. RESULTS

5.1. Data Analysis of the Measurement Model
The measurement model should be assessed before the structural model is examined. The measurement model can be assessed based on internal consistency, convergent validity, and discriminant validity (Barclay, Higgins, & Thompson, 1995). Cronbach's alpha is used to validate internal consistency, and a value of 0.7 or higher is recommended (Barclay et al., 1995). Item loadings are recommended to exceed 0.6 (Hair, Black, Babin, Anderson, & Tatham, 2006). The composite reliability values are recommended to exceed 0.7. The average variance extracted (AVE) value for each latent variable should exceed 0.5, and the square root of the AVE should be greater than the interconstruct correlations (Barclay et al., 1995; Chin, 1998).
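These reliability and convergent validity statistics follow standard formulas; a minimal sketch of how Cronbach's alpha, composite reliability, and AVE can be computed is given below, using synthetic item responses and example loadings rather than the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                          / items.sum(axis=1).var(ddof=1))

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    errors = 1 - lam ** 2  # assumes standardized loadings
    return lam.sum() ** 2 / (lam.sum() ** 2 + errors.sum())

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return float(np.mean(lam ** 2))

# Synthetic 7-point Likert responses for a three-item construct.
rng = np.random.default_rng(1)
base = rng.integers(3, 7, size=(100, 1))
items = np.clip(base + rng.integers(-1, 2, size=(100, 3)), 1, 7)

print(cronbach_alpha(items))                              # should exceed 0.7
print(composite_reliability([0.894, 0.956, 0.945]))       # should exceed 0.7
print(average_variance_extracted([0.894, 0.956, 0.945]))  # should exceed 0.5
```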

As shown in Table 2, all item loadings exceed 0.6, all AVEs are larger than 0.5, and the composite reliability and Cronbach's alpha values exceed 0.7. These results indicate good reliability and convergent validity (Bagozzi & Yi, 1988; Gefen et al., 2000).

To assess discriminant validity, we compared the square root of the AVE for each factor to its correlations with other factors. As can be seen from Table 3, the square root of the AVE for each factor is clearly larger than its correlation coefficients with the other factors. Thus, the scales have good discriminant validity (Fornell & Larcker, 1981; Gefen et al., 2000). In addition, the correlation values between latent variables are lower than 0.7, so multicollinearity issues were avoided. The measurement model was therefore judged to be reliable and valid for the study.
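A minimal sketch of the Fornell–Larcker comparison just described, using the AVE values and a few of the interconstruct correlations reported in Tables 2 and 3 (only three constructs are shown for brevity):

```python
import numpy as np

constructs = ["BI", "PU", "TRU"]
ave = np.array([0.868, 0.692, 0.652])   # from Table 2
corr = np.array([                       # from Table 3
    [1.000, 0.719, 0.735],
    [0.719, 1.000, 0.582],
    [0.735, 0.582, 1.000],
])

sqrt_ave = np.sqrt(ave)
for i, name in enumerate(constructs):
    # Largest absolute correlation of this construct with any other construct.
    max_r = np.abs(np.delete(corr[i], i)).max()
    print(f"{name}: sqrt(AVE) = {sqrt_ave[i]:.3f}, max |r| = {max_r:.3f}, "
          f"criterion met: {sqrt_ave[i] > max_r}")
```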

5.2. Data Analysis of the Structural Model
The structural model comprises the hypothesized relationships between latent constructs in the research model. By using bootstrap or jackknife sampling, we can obtain the path coefficients and their corresponding t values. With these values, we can assess statistical conclusion validity by testing the null hypothesis for each path coefficient.

TABLE 1
List of Constructs and Their Items

Behavioral intention
  BI1: I intend to use autonomous vehicle in the future.
  BI2: I expect that I would use autonomous vehicle in the future.
  BI3: I plan to use autonomous vehicle in the future.
Perceived usefulness
  PU1: Using autonomous vehicle will increase my productivity.
  PU2: Using autonomous vehicle will increase my driving performance.
  PU3: Using autonomous vehicle would enhance my effectiveness while driving.
Perceived ease of use
  PEOU1: Learning to operate autonomous vehicle would be easy for me.
  PEOU2: I would find it easy to get autonomous vehicle to do what I want to do.
  PEOU3: Interacting with autonomous vehicle would not require a lot of my mental effort.
Trust
  TRU1: Autonomous vehicle is dependable.
  TRU2: Autonomous vehicle is reliable.
  TRU3: Overall, I can trust autonomous vehicle.
System transparency
  ST1: I believe that autonomous vehicle acts consistently and its behavior can be forecast.
  ST2: I believe that I can form a mental model and predict future behavior of autonomous vehicle.
  ST3: I believe that I can predict how autonomous vehicle will act in a particular way.
Technical competence
  TC1: I believe that autonomous vehicle is free of error.
  TC2: I believe that I can depend and rely on autonomous vehicle.
  TC3: I believe that autonomous vehicle will consistently perform under a variety of circumstances.
Situation management
  SM1: I believe that autonomous vehicle provides alternative solutions.
  SM2: I believe that I can control the behavior of autonomous vehicle.
  SM3: I believe that autonomous vehicle will provide adequate, effective, and responsive help.
Perceived risk
  PR1: Autonomous vehicle would lead to a financial loss for me.
  PR2: Autonomous vehicle might not perform well and create problems.
  PR3: Using autonomous vehicle would be risky.
External locus of control
  ELOC1: Driving without accidents is mainly a matter of luck.
  ELOC2: Accidents usually happen because of unexpected events that occur during driving.
  ELOC3: It is difficult to prevent accidents when the driving conditions are difficult, such as darkness, rain, a narrow road with many turns.
Sensation seeking
  SS1: I would like to drive without a preplanned route and without a schedule.
  SS2: I think I would enjoy the experience of driving very fast on a steep road.
  SS3: I do not have patience for people who drive cars.


The explanatory power of the estimated model, or nomological validity, can be assessed by observing the R² of the endogenous constructs. Figure 2 shows the R² values for trust, perceived usefulness, and behavioral intention, which are 0.474, 0.340, and 0.676, respectively. Falk and Miller (1992) recommended that R² should be at least 0.10 in order for a latent construct to be judged as adequate.

Table 4 lists all path coefficients and their significance. A t test was conducted to test the significance of the path coefficients at the .05 significance level. All hypotheses are supported except H3, H10, and H12. Trust strongly affects perceived usefulness, and both factors determine behavioral intention. Among the two driving-related personality traits, locus of control had significant effects on behavioral intention, whereas sensation seeking did not.


TABLE 2
Scales for Reliability and Convergent Validity

Construct  Item    M     SD    Loading  α      CR     AVE
BI         BI1     4.75  1.56  0.894    0.924  0.952  0.868
           BI2     4.62  1.66  0.956
           BI3     4.82  1.61  0.945
PU         PU1     5.19  1.30  0.841    0.778  0.870  0.692
           PU2     4.95  1.72  0.744
           PU3     5.38  1.34  0.903
PEOU       PEOU1   5.04  1.28  0.916    0.884  0.928  0.811
           PEOU2   5.00  1.32  0.908
           PEOU3   4.96  1.35  0.876
TRU        TRU1    3.81  1.35  0.811    0.738  0.849  0.652
           TRU2    4.82  1.58  0.764
           TRU3    3.77  1.14  0.845
ST         ST1     4.65  1.47  0.880    0.847  0.908  0.766
           ST2     4.71  1.41  0.865
           ST3     5.02  1.39  0.882
TC         TC1     3.08  1.46  0.847    0.830  0.896  0.743
           TC2     2.47  1.35  0.867
           TC3     2.49  1.44  0.872
SM         SM1     5.16  1.41  0.887    0.865  0.917  0.787
           SM2     5.04  1.32  0.895
           SM3     5.18  1.40  0.879
PR         PR1     4.45  1.26  0.710    0.705  0.826  0.615
           PR2     5.40  1.43  0.751
           PR3     5.79  1.16  0.881
ELOC       ELOC1   4.41  1.44  0.786    0.726  0.844  0.644
           ELOC2   4.33  1.59  0.800
           ELOC3   4.31  1.54  0.820
SS         SS1     4.26  2.18  0.905    0.867  0.912  0.775
           SS2     3.65  1.71  0.830
           SS3     3.62  1.96  0.904

Note: α = Cronbach's alpha; CR = composite reliability; AVE = average variance extracted; BI = behavioral intention; PU = perceived usefulness; PEOU = perceived ease of use; TRU = trust; ST = system transparency; TC = technical competence; SM = situation management; PR = perceived risk; ELOC = external locus of control; SS = sensation seeking.

6. DISCUSSION AND CONCLUSION
This research was conducted to understand the adoption of autonomous vehicles at an early stage. To predict users' adoption of autonomous vehicles, we developed an adoption model of autonomous vehicles based on TAM and trust theory. A perceived risk factor and driving-related personality factors were added to extend the understanding of user acceptance. A survey was conducted to collect users' opinions, and the research model was statistically tested using PLS. The results show that, with the exception of three hypotheses, the hypotheses are supported by the data.

First, perceived usefulness had more influence on behavioral intention than did perceived ease of use among the TAM constructs. In line with several studies, moreover, perceived usefulness did not mediate the relationship between perceived ease of use and behavioral intention. In contrast, the results reveal only very weak direct effects of perceived ease of use on behavioral intention. This result is in line with recent meta-analytic results (King & He, 2006; Ma & Liu, 2004), which revealed very weak effects of perceived ease of use on usage intentions. Through these results, we identified drivers' expectations of autonomous vehicles: their intention to use or not use autonomous vehicles relies on how useful the vehicle is rather than on how easy it is to use. Drivers are already familiar with driving a vehicle, so using an autonomous vehicle is not that hard for them.

FIG. 2. Assessment of the structural model. Note: ∗p < .05. ∗∗p < .01. ∗∗∗p < .001.

TABLE 3
Correlation Matrix and Discriminant Validity

Construct   1       2       3       4       5       6       7       8       9       10
1. BI       0.932
2. PU       0.719   0.832
3. PEOU     0.274   0.208   0.900
4. TRU      0.735   0.582   0.258   0.807
5. ST       0.512   0.543   0.236   0.479   0.875
6. TC       0.354   0.315   0.153   0.486   0.396   0.862
7. DC       0.436   0.364   0.204   0.549   0.303   0.268   0.887
8. PR       −0.140  −0.040  0.027   −0.170  −0.037  −0.207  −0.018  0.784
9. ELOC     0.263   0.217   0.121   0.228   0.081   0.135   0.121   −0.067  0.803
10. SS      0.124   0.052   0.107   0.135   0.093   0.070   0.113   −0.014  −0.011  0.880

Note: Diagonal values (in bold in the original) are the square root of the average variance extracted for each construct. BI = behavioral intention; PU = perceived usefulness; PEOU = perceived ease of use; TRU = trust; ST = system transparency; TC = technical competence; DC = decision control (situation management); PR = perceived risk; ELOC = external locus of control; SS = sensation seeking.

Second, trust exhibited strong direct effects on perceived usefulness and behavioral intention. These results are in line with existing adoption studies in information technologies (Carter & Bélanger, 2005; Gefen et al., 2003; Lee & Moray, 1992, 1994; Lee & See, 2004; Parasuraman et al., 2008; Pavlou, 2003). This indicates that trust is a major construct for predicting the adoption of autonomous vehicles. In the case of the trust construct, 47.4% of the variance was explained by system transparency, technical competence, and situation management, all of which had significant effects on trust. This result shows that the three dimensions are suited for researching trust in autonomous vehicles. It also demonstrates that it is important to improve user perception of the accuracy of autonomous technologies and to provide information that helps drivers predict and understand the operation of autonomous vehicles. It further demonstrates the importance of providing functions that allow drivers to recover control in situations whenever they so desire. This finding is particularly important for car manufacturers when they design autonomous vehicles.

Third, contrary to other studies' findings, perceived risk is not a significant factor for predicting behavioral intention. Based on the previous literature, it was expected that perceived risk would lower behavioral intention. However, we identified that trust has a negative effect on perceived risk. This may be explained by the expectation that trusted autonomous vehicles will reduce environmental uncertainty and related risks.

Fourth, locus of control and sensation seeking were selected as driving-related personality factors. External locus of control significantly influenced behavioral intention. This result demonstrates that people who experience difficulty in driving, such as older adults, have a greater intention to use autonomous vehicles. In several previous studies, researchers documented that sensation seeking significantly influenced behavioral intention. However, the present result shows that sensation seeking is not a significant antecedent of behavioral intention. In other words, drivers expect a more novel experience in autonomous vehicles than the thrill and sensory experience of driving. For this reason, it is necessary to investigate user needs for novel content in autonomous vehicles.


TABLE 4
Structural Model Results

Hypothesis  Path         b       t Value  Significance  Support or Not
H1          PU → BI      0.436   13.708   ∗∗∗           Supported
H2          PEOU → BI    0.063   2.487    ∗             Supported
H3          PEOU → PU    0.063   1.778    0.076         Not supported
H4          TRU → BI     0.436   14.521   ∗∗∗           Supported
H5          TRU → PU     0.564   18.379   ∗∗∗           Supported
H6          TRU → PR     −0.170  3.812    ∗∗∗           Supported
H7          ST → TRU     0.246   6.446    ∗∗∗           Supported
H8          TC → TRU     0.284   8.432    ∗∗∗           Supported
H9          DC → TRU     0.398   10.297   ∗∗∗           Supported
H10         PR → BI      −0.045  1.767    0.078         Not supported
H11         ELOC → BI    0.060   2.377    ∗∗            Supported
H12         SS → BI      0.036   1.431    0.153         Not supported

Note: BI = behavioral intention; PU = perceived usefulness; PEOU = perceived ease of use; TRU = trust; ST = system transparency; TC = technical competence; DC = decision control (situation management); PR = perceived risk; ELOC = external locus of control; SS = sensation seeking.
∗p < .05. ∗∗p < .01. ∗∗∗p < .001.

Last, this research has some limitations. First, the respondents were not perfectly controlled: the gender and driving experience ratios were not balanced. The number of male respondents was relatively higher than the number of female respondents. Also, the age distribution of respondents was not balanced; the number of respondents from 20 to 39 years of age is relatively higher than the number of other respondents. Thus, the results could be biased toward younger men's opinions. Second, the model clearly does not include all relevant variables. Future research should test the possible inclusion of other external variables (e.g., personality characteristics). Therefore, more research is needed to validate, expand, and generalize these results.

REFERENCES
Adell, E. (2010). Acceptance of driver support systems. Proceedings of the European Conference on Human Centred Design for Intelligent Transport Systems, 475–486.
Bacon, L. D. (1999, February). Using LISREL and PLS to measure customer satisfaction. Paper presented at the Seventh Annual Sawtooth Software Conference, La Jolla, CA.
Bagozzi, R. P., & Yi, Y. (1988). On the evaluation of structural equation models. Journal of the Academy of Marketing Science, 16, 74–94.
Barclay, D., Higgins, C., & Thompson, R. (1995). The partial least squares (PLS) approach to causal modeling: Personal computer adoption and use as an illustration. Technology Studies, 2, 285–309.
Beiker, S. A. (2012). Legal aspects of autonomous driving. Santa Clara Law Review, 52(4), Article 1.
Berry, L. L. (1995). Relationship marketing of services—Growing interest, emerging perspectives. Journal of the Academy of Marketing Science, 23, 236–245.
Burns, P. C., & Wilde, G. J. (1995). Risk taking in male taxi drivers: Relationships among personality, observational data and driver records. Personality and Individual Differences, 18, 267–278.
Carter, L., & Bélanger, F. (2005). The utilization of e-government services: Citizen trust, innovation and acceptance factors. Information Systems Journal, 15, 5–25.
Chau, P. Y. K., & Hu, P. J. H. (2002). Investigating healthcare professionals' decisions to accept telemedicine technology: An empirical test of competing theories. Information & Management, 39, 297–311.
Chin, W. W. (1998). The partial least squares approach to structural equation modeling. Modern Methods for Business Research, 295, 295–336.
Chin, W. W., Marcolin, B. L., & Newsted, P. R. (2003). A partial least squares latent variable modeling approach for measuring interaction effects: Results from a Monte Carlo simulation study and an electronic-mail emotion/adoption study. Information Systems Research, 14, 189–217.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13, 319–340.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35, 982–1003.
Davis, F. D., & Venkatesh, V. (1996). A critical assessment of potential measurement biases in the technology acceptance model: Three experiments. International Journal of Human–Computer Studies, 45, 19–45.
Doney, P. M., & Cannon, J. P. (1997). An examination of the nature of trust in buyer–seller relationships. The Journal of Marketing, 61, 35–51.
Falk, R. F., & Miller, N. B. (1992). A primer for soft modeling. Akron, OH: University of Akron Press.
Featherman, M. S., & Pavlou, P. A. (2003). Predicting e-services adoption: A perceived risk facets perspective. International Journal of Human–Computer Studies, 59, 451–474.
Feypell, V., & Scheunemann, J. (2012). Road deaths: Latest traffic safety data released. Retrieved from http://internationaltransportforum.org/Press/PDFs/2012-05-02IRTAD.pdf
Fornell, C., & Bookstein, F. L. (1982). Two structural equation models: LISREL and PLS applied to consumer exit-voice theory. Journal of Marketing Research, 19, 440–452.
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18, 39–50.
Gefen, D., Karahanna, E., & Straub, D. W. (2003). Trust and TAM in online shopping: An integrated model. MIS Quarterly, 27, 51–90.
Gefen, D., & Straub, D. W. (2000). The relative importance of perceived ease of use in IS adoption: A study of e-commerce adoption. Journal of the Association for Information Systems, 1, 8.
Gefen, D., Straub, D., & Boudreau, M. C. (2000). Structural equation modeling and regression: Guidelines for research practice. Communications of the Association for Information Systems, 4, 7.


Ghazizadeh, M., Peng, Y., Lee, J. D., & Boyle, L. N. (2012, September). Augmenting the technology acceptance model with trust: Commercial drivers' attitudes towards monitoring and feedback. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 2286–2290.
Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2006). Multivariate data analysis (Vol. 6). Upper Saddle River, NJ: Pearson Prentice Hall.
Hasan, Z., Krischkowsky, A., & Tscheligi, M. (2012). Modelling user-centered-trust (UCT) in software systems: Interplay of trust, affect and acceptance model (pp. 92–109). Berlin, Germany: Springer.
Jarvenpaa, S. L., Tractinsky, N., & Saarinen, L. (1999). Consumer trust in an Internet store: A cross-cultural validation. Journal of Computer-Mediated Communication, 5(2).
Johnson, T. (2013, January 31). Enhancing safety through automation (SAE gov't-industry meeting, automation and connected vehicle safety). Washington, DC: National Highway Traffic Safety Administration. Available from http://www.sae.org/events/gim/presentations/2013/johnson_tim.pdf
Jonah, B. A. (1997). Sensation seeking and risky driving: A review and synthesis of the literature. Accident Analysis & Prevention, 29, 651–665.
Kieras, D. E., & Bovair, S. (1984). The role of a mental model in learning to operate a device. Cognitive Science, 8, 255–273.
King, W. R., & He, J. (2006). A meta-analysis of the technology acceptance model. Information & Management, 43, 740–755.
Krause, N., & Stryker, S. (1984). Stress and well-being: The buffering role of locus of control beliefs. Social Science & Medicine, 18, 783–790.
Lee, J., & Moray, N. (1992). Trust, control strategies and allocation of function in human–machine systems. Ergonomics, 35, 1243–1270.
Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human–Computer Studies, 40, 153–184.
Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46, 50–80.
Leimeister, J. M., Ebner, W., & Krcmar, H. (2005). Design, implementation, and evaluation of trust-supporting components in virtual communities for patients. Journal of Management Information Systems, 21, 101–131.
Lippert, S. K. (2001). An exploratory study into the relevance of trust in the context of information systems technology (Unpublished doctoral dissertation). Washington, DC: George Washington University.
Ma, Q., & Liu, L. (2004). The technology acceptance model: A meta-analysis of empirical findings. Journal of Organizational and End User Computing, 16, 59–72.
Madsen, M., & Gregor, S. (2000, December). Measuring human–computer trust. Proceedings of the Eleventh Australasian Conference on Information Systems, 6–8.
Maltz, M., Sun, H., Wu, Q., & Mourant, R. (2004). In-vehicle alerting system for older and younger drivers: Does experience count? Transportation Research Record: Journal of the Transportation Research Board, 1899, 64–70.
Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.
Meschtscherjakov, A., Wilfinger, D., Scherndl, T., & Tscheligi, M. (2009, September). Acceptance of future persuasive in-car interfaces towards a more economic driving behaviour. Proceedings of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 81–88.
Mitchell, V. W. (1999). Consumer perceived risk: Conceptualisations and models. European Journal of Marketing, 33, 163–195.
Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology: Applied, 6, 44.
Muir, B. M. (1987). Trust between humans and machines, and the design of decision aids. International Journal of Man–Machine Studies, 27, 527–539.
Norman, D. A. (2002). The design of everyday things. New York, NY: Basic Books.
Numan, J. H. (1998). Knowledge-based systems as companions: Trust, human computer interaction and complex systems (Doctoral dissertation). University of Groningen.

Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2008). Situation awareness, mental workload, and trust in automation: Viable, empirically supported cognitive engineering constructs. Journal of Cognitive Engineering and Decision Making, 2, 140–160.
Parkes, K. R. (1984). Locus of control, cognitive appraisal, and coping in stressful episodes. Journal of Personality and Social Psychology, 46, 655–668.
Pavlou, P. A. (2003). Consumer acceptance of electronic commerce: Integrating trust and risk with the technology acceptance model. International Journal of Electronic Commerce, 7, 101–134.
Payre, W., Cestac, J., & Delhomme, P. (2014). Intention to use a fully automated car: Attitudes and a priori acceptability. Transportation Research Part F: Traffic Psychology and Behaviour, 27, 252–263.
Peter, J. P., & Ryan, M. J. (1976). An investigation of perceived risk at the brand level. Journal of Marketing Research, 13, 184–188.
Ratnasingham, P., & Kumar, K. (2000, December). Trading partner trust in electronic commerce participation. Proceedings of the Twenty-First International Conference on Information Systems, 544–552.
Riley, V. A. (1994). Human use of automation (Unpublished doctoral dissertation). Minneapolis, MN: University of Minnesota.
Rotter, J. B. (1966). Generalized expectancies for internal versus external control of reinforcement. Psychological Monographs: General and Applied, 80(1), Whole No. 609.
Rudin-Brown, C. M., & Ian Noy, Y. (2002). Investigation of behavioral adaptation to lane departure warnings. Transportation Research Record: Journal of the Transportation Research Board, 1803, 30–37.
Rudin-Brown, C. M., & Parker, H. A. (2004). Behavioural adaptation to adaptive cruise control (ACC): Implications for preventive strategies. Transportation Research Part F: Traffic Psychology and Behaviour, 7, 59–76.
Schepers, J., & Wetzels, M. (2007). A meta-analysis of the technology acceptance model: Investigating subjective norm and moderation effects. Information & Management, 44, 90–103.
Sheppard, B. H., Hartwick, J., & Warshaw, P. R. (1988). The theory of reasoned action: A meta-analysis of past research with recommendations for modifications and future research. Journal of Consumer Research, 15, 325–343.
Sheridan, T. B. (1975). Considerations in modeling the human supervisory controller. International Federation of Automatic Control, Triennial World Congress, 6th, 40.
Sheridan, T. B., & Hennessy, R. T. (1984). Research and modeling of supervisory control behavior. Report of a workshop. Washington, DC: National Research Council, Committee on Human Factors.
Söllner, M., Hoffmann, A., Hoffmann, H., & Leimeister, J. M. (2011, December). Towards a theory of explanation and prediction for the formation of trust in IT artifacts. Annual Workshop on HCI Research in MIS (Vol. 4).
Stanton, N. A., & Marsden, P. (1996). From fly-by-wire to drive-by-wire: Safety implications of automation in vehicles. Safety Science, 24, 35–49.
Stanton, N. A., & Young, M. S. (2005). Driver behaviour with adaptive cruise control. Ergonomics, 48, 1294–1313.
Stutts, J. C. (1998). Do older drivers with visual and cognitive impairments drive less? Journal of the American Geriatrics Society, 46, 854–861.
Taylor, S., & Todd, P. A. (1995). Understanding information technology usage: A test of competing models. Information Systems Research, 6, 144–176.
Thatcher, J. B., McKnight, D., Baker, E. W., Arsal, R. E., & Roberts, N. H. (2011). The role of trust in postadoption IT exploration: An empirical examination of knowledge management systems. IEEE Transactions on Engineering Management, 58, 56–70.
Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46, 186–204.
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27, 425–478.

Ward, N. J., Fairclough, S., & Humphreys, M. (1995). The effect of task automatisation in the automotive context: A field study of an Autonomous Intelligent Cruise Control system. Proceedings of the International Conference on Experimental Analysis and Measurement of Situation Awareness, November 1–3, Daytona Beach, Florida.

West, C. G., Gildengorin, G., Haegerstrom-Portnoy, G., Lott, L. A., Schneck, M. E., & Brabyn, J. A. (2003). Vision and driving self-restriction in older adults. Journal of the American Geriatrics Society, 51, 1348–1355.
Wu, J. H., Shen, W. S., Lin, L. M., Greenes, R. A., & Bates, D. W. (2008). Testing the technology acceptance model for evaluating healthcare professionals' intention to use an adverse event reporting system. International Journal for Quality in Health Care, 20, 123–129.
Zuckerman, M. (1994). Behavioral expressions and biosocial bases of sensation seeking. New York, NY: Cambridge University Press.

ABOUT THE AUTHORS
Jong Kyu Choi is a Ph.D. student at the Department of Information and Industrial Engineering, Yonsei University, Seoul, South Korea. His current research interests mainly focus on human-automation interaction.

Yong Gu Ji is a professor at the Department of Information and Industrial Engineering, Yonsei University, Seoul, South Korea. His research covers human-computer interaction and interaction design.