Trust as a second-order construct: Investigating the relationship between consumers and virtual agents
Introduction
In the digital era, artificial intelligence (AI)-powered virtual agents (VAs) have become prevalent in the retail industry. VAs refer to AI software applications and automated technologies that are digitally generated using computer algorithms (Fernandes & Oliveira, 2021; Li, 2015). Unlike social robots, which are physically embodied, VAs are disembodied and exist only as virtual representations (Lee, Jung, Kim, & Kim, 2006). In addition to common AI-powered personal voice assistants such as Alexa or Siri, other AI technologies embedded in mobile devices, social messaging platforms, or other digital channels (e.g., automated phone systems, chatbots, and personalized subscription boxes) are considered forms of VAs (Guzman, 2019). A report indicated that more than 75% of companies have employed AI applications for business (Forbes, 2019). By 2024, VAs such as Amazon’s Alexa, Apple’s Siri, Google’s Assistant, and Microsoft’s Cortana are expected to share a $12 billion market (Baron, 2017).
VAs are expected to play a critical role in customer service, as this type of support technology is trained to perform tasks normally carried out by employees (Kumar, Dixit, Javalgi, & Dass, 2016). Accordingly, most service interactions between consumers and employees are expected to be transformed into consumer-AI technology interactions by 2025 (Forbes, 2018). Despite the growing use of VAs in the retail industry, evidence suggests that consumers still resist such technologies (Fernandes & Oliveira, 2021). Fernandes and Oliveira (2021) reported that 86% of consumers prefer to communicate with a human employee to address service issues. The dilemma of rapidly evolving AI technology and slow consumer adoption may hinder the successful application of VAs in retail and service contexts. Indeed, consumers tend to be innately skeptical of algorithms and show a general aversion toward them, especially when the algorithms make mistakes (Dietvorst et al., 2014; Schmidt et al., 2020). After experiencing algorithmic errors caused by AI, consumers quickly lost confidence in the technology and were more likely to choose a human employee who performed the same tasks over the AI (Dietvorst et al., 2014). Furthermore, the lack of physical presence likely leads to lower trust in VAs than in service robots (Salem, Lakatos, Amirabdollahian, & Dautenhahn, 2015). This innate skepticism creates barriers for consumers in adopting VAs and building relationships with them.
Previous research has employed the Technology Acceptance Model (TAM) to better understand the adoption of AI technology in the context of service robots (i.e., the Service Robot Acceptance Model, sRAM) (Wirtz et al., 2018). Using this context-specific framework, Lu et al. (2020) found that relational variables such as trust (consumers’ beliefs in the reliability of service robots) and rapport (connections between consumers and service robots) are underexplored drivers of humans’ acceptance of service robots. Given that trust serves as an essential relational motivation to adopt innovative technologies and to form relationships with them (Fernandes & Oliveira, 2021), it is necessary to explore consumer trust in the VA setting. Therefore, the specific purposes of this study are twofold: (a) to investigate the dimensionality of trust in a VA context; and (b) to explore the drivers and antecedents of consumer-VA trust.
The way trust has been conceptualized varies to a great extent depending on the context. The concept of trust is dynamic and changes according to the behavior of the trusted agent, such as the type of technology being trusted (Crisp & Jarvenpaa, 2013). In particular, the conceptualization of trust in the consumer-AI relationship depends on the AI’s representation (e.g., virtual/disembodied, physical) and its level of intelligent capabilities (Glikson & Woolley, 2020). VAs are characterized by their disembodied representation as well as their unique interactive capabilities (Glikson & Woolley, 2020). Therefore, it is logical to postulate that consumer trust in VAs unfolds differently from trust building with other technologies.
This research proposes consumer-VA trust as a second-order construct consisting of three first-order reflective constructs (VA competence, VA integrity, and consumer self-efficacy) that function together to form the trust construct. Drawing on the extended TAM, the proposed research framework postulates trust as a second-order construct that positively influences consumers’ perceptions (i.e., perceived usefulness and perceived enjoyment) and, subsequently, their intentions to continue using VAs. To the best of our knowledge, this research is the first empirical study to explore consumer trust in VAs as a second-order construct manifested by three unique first-order factors. To demonstrate the robustness of the second-order model, we developed two rival models; comparisons between the research model and the two rival models further validated the proposed framework, which showed strong measurement coefficients.
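To make the proposed measurement structure concrete, the sketch below shows how such a second-order trust model could be specified with the Python semopy package. This is an illustrative example only, not the authors' actual analysis; the item names (comp1, integ1, se1, pu1, pe1, ci1, etc.) and the data file survey_responses.csv are hypothetical placeholders.

# Illustrative sketch (not the authors' analysis): a second-order SEM for
# consumer-VA trust using semopy's lavaan-style model syntax.
import pandas as pd
import semopy

MODEL_DESC = """
# First-order reflective constructs (hypothetical item names)
Competence   =~ comp1 + comp2 + comp3
Integrity    =~ integ1 + integ2 + integ3
SelfEfficacy =~ se1 + se2 + se3

# Second-order construct: consumer-VA trust
Trust =~ Competence + Integrity + SelfEfficacy

# Outcome constructs from the extended TAM framework
Usefulness  =~ pu1 + pu2 + pu3
Enjoyment   =~ pe1 + pe2 + pe3
Continuance =~ ci1 + ci2 + ci3

# Structural paths: trust -> perceptions -> continued use
Usefulness ~ Trust
Enjoyment  ~ Trust
Continuance ~ Usefulness + Enjoyment
"""

data = pd.read_csv("survey_responses.csv")  # hypothetical item-level survey data
model = semopy.Model(MODEL_DESC)
model.fit(data)

print(semopy.calc_stats(model))  # global fit indices (CFI, RMSEA, etc.)
print(model.inspect())           # factor loadings and structural path estimates

A rival specification (e.g., the three trust dimensions predicting the outcomes directly as first-order factors) could be written in the same syntax and compared on fit indices, mirroring the rival-model comparison described above.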
Section snippets
Literature review
According to the “computers as social actors” (CASA) paradigm, when consumers interact with technologies such as computers and other new media, the interaction process tends to be essentially social (Nass, Moon, Fogg, Reeves, & Dryer, 1995). During these interactions, humans not only expect computers to follow social norms but also treat them like humans by assigning human-like personalities and characteristics to them (Edwards et al., 2019; Nass et al., 1995). Extant literature on
Research design and procedures
A Qualtrics web-based survey was developed, and a link to the online questionnaire was distributed via email to students at a United States university. Participants enrolled in large undergraduate courses were invited to participate in exchange for extra credit. The survey method was used to (1) explore consumer-VA trust as a second-order construct with three first-order reflective constructs (i.e., competence, integrity, and self-efficacy) and (2) investigate the impacts of
Participant characteristics
A total of 258 respondents were recruited, and 192 usable responses were retained for data analysis. The majority of participants were female (89.6%), and 77.1% were white. Most were 19–20 years old (68.7%) and in their freshman or sophomore year (79.2%). A student sample was used because millennials and Generation Z are considered the most prolific users of virtual assistants (Statista.com, 2017). Researchers have demonstrated that student samples are appropriate based on
Discussion
The current research proposed and empirically tested a robust framework of consumer-VA trust. Despite the growing interest in AI-powered agents, consumer trust in VAs has been underexplored. This is the first study to offer a comprehensive view of trust by examining it as a second-order construct, thereby capturing relationship building between consumers and VAs. In addition, this study investigates the role of consumer-VA trust in shaping consumer perceptions, which further lead to
Limitations and future research
Despite the significant results of this study, there are some limitations that invite future research. Data were collected using a convenience sample, and the study focused on specific cohorts, namely millennials and Generation Z. College students are reported to be relatively homogeneous (Peterson, 2001), which reduces sample heterogeneity, so further investigation is needed before applying the results to the general public. To increase the generalizability of the results, future research
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
References (104)
- The impact of the online and offline features on the user acceptance of internet shopping malls. Electron. Commer. Res. Appl. (2004).
- The theory of planned behavior. Organ. Behav. Hum. Decis. Process. (1991).
- I, chatbot: Modeling the determinants of users’ satisfaction and continuance intention of AI-powered service agents. Telematics Inform. (2020).
- How shall I trust the faceless and the intangible? A literature review on the antecedents of online trust. Comput. Hum. Behav. (2010).
- Should AI-based, conversational digital assistants employ social- or task-oriented interaction style? A task-competency and reciprocity perspective for older adults. Comput. Hum. Behav. (2019).
- Autonomous shopping systems: Identifying and overcoming barriers to consumer adoption. J. Retail. (2020).
- Evaluations of an artificial intelligence instructor’s voice: Social identity theory in human-robot interactions. Comput. Hum. Behav. (2019).
- Understanding consumers’ acceptance of automated technologies in service encounters: Drivers of digital voice assistants adoption. J. Busin. Res. (2021).
- Voices in and of the machine: Source orientation toward mobile virtual assistants. Comput. Hum. Behav. (2019).
- Consumer e-shopping acceptance: Antecedents in a technology acceptance model. J. Busin. Res. (2009).
- How vloggers embrace their viewers: Focusing on the roles of para-social interactions and flow experience. Telematics Inform.
- Internet self-efficacy and electronic service acceptance. Decis. Support Syst.
- Achieving customer value from electronic channels through identity commitment, calculative commitment, and trust in technology. J. Interact. Market.
- Cognitive and affective trust in service relationships. J. Busin. Res.
- Self-efficacy and acceptance of robots. Comput. Hum. Behav.
- Are physically embodied social agents better than disembodied social agents? The effects of physical embodiment, tactile interaction, and people’s loneliness in human-robot interaction. Int. J. Hum. Comput. Stud.
- The adoption of virtual reality devices: The technology acceptance model integrating enjoyment, social interaction, and strength of the social ties. Telematics Inform.
- The virtual reality hardware acceptance model (VR-HAM): Extending and individuating the technology acceptance model (TAM) for virtual reality hardware. J. Busin. Res.
- Can computer personalities be human personalities? Int. J. Hum. Comput. Stud.
- Information quality, trust, and risk perceptions in electronic data exchanges. Decis. Support Syst.
- Validation of haptic enabling technology acceptance model (HE-TAM): Integration of IDT and TAM. Telematics Inform.
- Modeling and testing consumer trust dimensions in e-commerce. Comput. Hum. Behav.
- Shopping intention at AI-powered automated retail stores (AIPARS). J. Retail. Consum. Serv.
- Artificial intelligence and the new forms of interaction: Who has the control when interacting with a chatbot? J. Busin. Res.
- Feeling our way to machine minds: People’s emotions when perceiving mind in artificial intelligence. Comput. Hum. Behav.
- How perceived trust mediates merchant’s intention to use a mobile wallet technology. J. Retail. Consum. Serv.
- Technology acceptance theories and factors influencing artificial intelligence-based intelligent products. Telematics Inform.
- Point of adoption and beyond: Initial trust and mobile-payment continuation intention. J. Retail. Consum. Serv.
- An overview of online trust: Concepts, elements, and implications. Comput. Hum. Behav.
- Effects of rational and social appeals of online recommendation agents on cognition- and affect-based trust. Decis. Support Syst.
- Self-efficacy, trust, and perceived benefits in the co-creation of value by consumers. Int. J. Retail Distribut. Manage.
- Consumer choice and autonomy in the age of artificial intelligence and big data. Custom. Need. Solut.
- Usefulness, enjoyment, and self-image congruence: The adoption of e-Book readers. Psychol. Market.
- Consumer perspectives on standardization in international advertising: A student sample. J. Advertis. Res.
- Role of cognitive absorption in building user trust and experience. Psychol. Market.
- Self-efficacy: Toward a unifying theory of behavioral change. Psychol. Rev.
- Understanding anthropomorphism in service provision: A meta-analysis of physical robots, chatbots, and other AI. J. Acad. Mark. Sci.
- Interpreting dimensions of consumer trust in e-commerce. Inf. Technol. Manage.
- Competence- and integrity-based trust in interorganizational relationships: Which matters more? J. Manage.
- Swift trust in global virtual teams: Trusting beliefs and normative actions. J. Personn. Psychol.
- Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly.
- Frontline service technology infusion: Conceptual archetypes and future research directions. J. Serv. Manage.
- Understanding algorithm aversion: Forecasters erroneously avoid algorithms after seeing them err. Acad. Manage. Proceed.
- An empirical investigation into the relationship between computer self-efficacy, anxiety, experience, support and usage. J. Comput. Informat. Syst.
- Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res.
- A unified perspective on the factors influencing consumer acceptance of internet of things technology. Asia Pac. J. Market. Logist.