Deception in cyberspace: A comparison of text-only vs. avatar-supported medium

https://doi.org/10.1016/j.ijhcs.2007.04.005

Abstract

The use of anthropomorphic avatars gives Internet users the opportunity and freedom to manipulate their identity. As cyberspace becomes a haven for deceptive behavior, human–computer interaction research is needed to study and understand such behavior. The objective of this research is to investigate the behavior of deceivers and non-deceivers (or truth-tellers) in cyberspace. We examine whether the intention to deceive others influences one's choice of avatar in an online chat environment. We also investigate whether the communication medium (text-only vs. avatar-supported chat) influences one's perception of the trustworthiness of the communication partner. A lab experiment was conducted in an online chat environment with dyads. The results indicate that in the text-only chat environment, subjects who were deceiving their partner experienced higher anxiety levels than those who were truthful; the same phenomenon, however, was not observed in the avatar-supported chat environment. This suggests that “wearing a mask” in cyberspace may reduce the anxiety of deceiving others. Additionally, deceivers are more likely to choose avatars that differ from their real selves. The results also show that the use of avatars in a computer-mediated chat environment does not affect one's perceived trustworthiness.

Introduction

… Han Sang, a 14-year-old boy in Seoul, stole $35 from his parents to buy sunglasses and other accessories. The petty thievery was bad enough, but what really irked his dad, Kim Sung Bae, was that none of the stuff he bought was real. They were for the animated character, or avatar, the boy used as a stand-in for himself on the Internet. Han was spending four hours each night hanging out online with his friends and wanted his virtual stand-in to look as cool as possible…

Forbes Magazine 7/21/2003

… In real life, Rebecca Nesson—slight, 30, short hair—is a lawyer working on a doctorate in computer science who teaches at Harvard Extension School. Her Second Life avatar, or cartoon self, is Rebecca Berkman, who looks like Nesson and who, standing in a virtual amphitheater, leads avatars from as far away as Korea and Houston in a discussion section of the Extension School course CyberOne…

The Boston Globe 10/25/2006

The use of humanoid/anthropomorphic avatars provides new ways for people to interact. An avatar is a virtual representation of oneself that other users can see and interact with in a virtual environment. Avatars were first used in computer gaming and online chat rooms. Their use on the Internet has become very popular in countries such as Korea and the United States: more than 3.6 million Koreans possess their own individualized avatars, or virtual selves, that represent them in chat rooms and e-mails (Fulford, 2003). The use of avatars has also extended beyond entertainment to education, virtual meeting rooms, sales activities, fashion, trade shows, and government.

Avatars offer a new form of visual representation and a different type of anonymity on the Internet. The use of avatars may help to overcome problems associated with the lack of social cues in computer-mediated communication (CMC). Having information about the identity of those with whom you communicate is essential for understanding and evaluating an interaction (Donath, 1998), but in CMC, many of these cues are eliminated or attenuated (Carlson and George, 2004). Interpersonal cues in communication are also important for determining the truthfulness of the message delivered (Carlson et al., 2004). To overcome problems with lean CMC media, avatars are commonly used in chat rooms or instant messaging (IM) environments to provide visual and social cues (Walther, 1992; Scheidt, 2004).

This paper examines the use of avatars in IM in the context of deceptive communication. According to the literature, deception is a part of our everyday personal life (Zhou et al., 2003; Hancock et al., 2004) and business life (Zuckerman et al., 1981; Cialdini et al., 2004). People deceive about all kinds of issues “from the mundane, such as opinions about appearance, to the essential, such as courtroom testimony or military interrogations” (Carlson et al., 2004). The number of people being victimized by deceptive practices over the Internet (e.g., frauds, misleading advertisements) is also on the rise (Grazioli and Wang, 2001; Internet Fraud Watch, 2006).

Similarly, deception is common in CMC (Walther and Tidwell, 1996; Bowker and Tuffin, 2003). Relational processes in CMC can be exaggerated because CMC offers more opportunities for managing self-presentation (Carlson et al., 2004). The anonymity provided by CMC may foster deception (Bowker and Tuffin, 2003), as the deceiver is able to dissociate himself/herself from the deceptive message. This is especially true on the Internet, where high anonymity is possible and where it is difficult to establish identity and accountability for deception. For example, the number of Internet fraud cases grew, on average, by more than 250% annually (Grazioli and Jarvenpaa, 2003). The Internet Crime Complaint Center (IC3) (2005), created by the Federal Bureau of Investigation and the National White Collar Crime Center, has also received an increasing number of complaints over the years.

As stated by Suler (1999), “the beauty (and misfortune) of the Internet is that it offers the opportunity for users to experiment with their identity”. Users are limited only by technology and imagination in presenting their virtual selves. With the anonymity provided by the Internet, users may take on a false identity or engage in deceptive communication without being detected. Detecting deception is generally difficult: studies have shown that lie detection rates in face-to-face settings range from 45% to 65% (Kraut, 1980; Zuckerman et al., 1981; Kalbfleisch, 1992), meaning the odds of detecting deception are roughly even, and the probability is lower still in CMC. Consequently, as cyberspace becomes a haven for deceptive behavior, it is important to study that behavior. A better understanding of deception in CMC can help educate Internet users about online deception and, more importantly, inform the design of online communication environments that promote truthfulness and inhibit deception.

As companies increasingly use IM and avatars, it is important to study and understand deception in avatar-supported online chat environments. IM is increasingly used in organizations to support internal and external communication. Intel, for example, uses IM technology to facilitate communication and coordination among its geographically distributed teams (Intel Corporation, 2004). Other companies use IM to communicate with customers and to provide a variety of customer services, including answering questions, resolving problems, making clarifications, and giving advice. For example, one of the factors identified for Lands’ End's e-commerce success is its use of online IM (Dukcevich, 2002). As IM becomes increasingly popular in organizations, we need a better understanding of the cues and behaviors associated with deception in the online IM environment.

Section snippets

Literature review

The relationship of online vs. off-line identity is a central theme in research on communication in cyberspace. Identity construction in cyberspace is directly related to the nature of interactions and opportunities offered by the environment (Talamo and Ligorio, 2001). Social interaction on the Internet has some unique characteristics, such as anonymity and control (i.e., the user can log on and off, write and rewrite, etc.) (McKenna and Bargh, 1998). Thus, the online environment provides

Theoretical background and hypotheses

Deception generally involves messages and information knowingly transmitted to create a false or misleading conclusion (Zhou and Zhang, 2004). Deception demands mental effort, including manipulating information, strategically controlling behavior, and managing one's image (Buller and Burgoon, 1996). When a person is not honest in his/her self-presentation, he/she will not be relaxed or comfortable and will likely be nervous from engaging in highly absorbing self-monitoring (Mabry, 2002). In

Methodology

The study was carried out using a 2×2 experimental design. The two factors were (a) truthfulness condition (deception vs. truth) and (b) communication mode (text-only vs. avatar-supported online chat). An experimental setting was chosen so that deception could be manipulated.
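A 2×2 design of this kind is analyzed by comparing the deception effect within each communication mode and then testing whether that effect differs across modes (the interaction). The sketch below illustrates these computations only; all scores are invented for illustration and do not reproduce the paper's data.

```python
import numpy as np

# Hypothetical state-anxiety scores for the four cells of the 2x2 design:
# (deception vs. truth) x (text-only vs. avatar-supported chat).
# All values are invented for illustration only.
cells = {
    ("deception", "text"):   np.array([4.1, 3.8, 4.4, 4.0]),
    ("truth",     "text"):   np.array([2.9, 3.1, 2.7, 3.0]),
    ("deception", "avatar"): np.array([3.0, 3.2, 2.8, 3.1]),
    ("truth",     "avatar"): np.array([3.0, 2.9, 3.1, 2.9]),
}

means = {cell: scores.mean() for cell, scores in cells.items()}

# Simple effect of deception within each communication mode.
effect_text   = means[("deception", "text")]   - means[("truth", "text")]
effect_avatar = means[("deception", "avatar")] - means[("truth", "avatar")]

# Interaction contrast: does the deception effect depend on the medium?
interaction = effect_text - effect_avatar
```

With these made-up numbers the deception effect is large in text-only chat and near zero in avatar-supported chat, which is the qualitative pattern the paper reports; in the actual analysis such a contrast would be tested with an ANOVA/ANCOVA F-test rather than inspected directly.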

The subjects were randomly assigned to dyads, and each dyad was randomly assigned to one of the four experimental conditions: (i) text-only chat with no deception; (ii)

Data analysis

A total of 94 business undergraduates (26% female, 74% male) participated in the experiment. The number of subjects in each cell is shown in Table 1. Subjects' ages ranged from 19 to 40, with an average of 23. Most of the participants (71%) did not have previous experience with chat rooms; hence, a training session was provided to familiarize them with online chat. These demographic variables are used as controls (i.e., covariates) in our data analysis.

The Cronbach
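The snippet above breaks off at the reliability check. Cronbach's alpha is the standard internal-consistency statistic for multi-item scales such as anxiety and trustworthiness measures. A minimal sketch of the computation in plain NumPy, with a toy data matrix invented purely for illustration:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy example: three perfectly consistent items yield alpha = 1.0.
scores = np.array([[1, 1, 1],
                   [2, 2, 2],
                   [3, 3, 3],
                   [4, 4, 4]], dtype=float)
alpha = cronbach_alpha(scores)
```

Values above the conventional 0.7 threshold are usually taken to indicate acceptable scale reliability; `ddof=1` gives sample variances, and using population variances (`ddof=0`) changes alpha only negligibly for reasonable sample sizes.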

Discussion of results

The results from this study show that in the text-only medium, users who are deceiving others online will experience higher state anxiety levels than users who are truthful in the communication. However, in the avatar-supported chat environment, such differences do not exist. Although deceivers in text-only chat exhibit higher state anxiety levels than truth-tellers in text-only chat, the heightened state anxiety levels of deceivers diminish in the avatar-supported chat environment. Thus, the

Conclusions, limitations, and future research

With the increased use of CMC in both social and business settings, it is very important to study online deceptive behavior. While additional research is needed to further understand other issues related to deception and the use of avatars, our study shows that deceivers exhibit higher state anxiety levels than truth-tellers in text-only chat but not in avatar-supported chat. It is also found that those who are deceiving are more likely to choose avatars that are different from their real

References (86)

  • J. Siegel et al., Group processes in computer-mediated communication, Organizational Behavior and Human Decision Processes (1986)
  • M. Zuckerman et al., Verbal and nonverbal communication of deception
  • M. Argyle, Bodily communication (1988)
  • M.J. Baker et al., The impact of physically attractive models on advertising evaluations, Journal of Marketing Research (1977)
  • Benoit, W.L., 2006. Persuasion: the Yale approach. Retrieved May 3, 2006, from...
  • S. Booth-Butterfield et al., Communication anxiety inventory
  • Bos, N., Olson, J.S., Gergle, D., Olson, G.M., Wright, Z., 2002. Effects of four computer-mediated communications...
  • Bowker, N.I., 2001. Understanding online communities through multiple methodologies combined under a postmodern...
  • N. Bowker et al., Dicing with deception: people with disabilities’ strategies for managing safety and identity online, Journal of Computer-Mediated Communication (2003)
  • P. Bull, Body movement and interpersonal communication (1983)
  • P. Bull et al., The social psychology of facial appearance (1988)
  • D.B. Buller et al., Interpersonal deception theory, Communication Theory (1996)
  • D.B. Buller et al., Interpersonal deception: I. Deceivers’ reactions to receivers’ suspicions and probing, Communication Monographs (1991)
  • D.B. Buller et al., Testing interpersonal deception theory: the language of interpersonal deception, Communication Theory (1996)
  • J. Burgoon, Nonverbal signals
  • J.K. Burgoon et al., Interpersonal deception: XII. Information management dimensions underlying deceptive and truthful messages, Communication Monographs (1996)
  • J.K. Burgoon et al., Toward computer-aided support for the detection of deception, Group Decision and Negotiation (2004)
  • Burgoon, J.K., Stoner, G.M., Bonito, J.A., Dunbar, N.E., 2003. Trust and deception in mediated communication....
  • J.R. Carlson et al., Media appropriateness in the conduct and discovery of deceptive communication: the relative influence of richness and synchronicity, Group Decision and Negotiation (2004)
  • J.R. Carlson et al., Deception in computer-mediated communication, Group Decision and Negotiation (2004)
  • S. Chaiken, Physical appearance and social influence
  • R.B. Cialdini et al., The hidden costs of organizational dishonesty, Leadership and Organizational Studies (2004)
  • M.J. Culnan et al., Information technologies
  • R. Daft et al., Organizational information, media richness and structural design, Management Science (1986)
  • Danet, B., Ruedenberg-Wright, L., Rosenbaum-Tamari, J., 1997. “HMMM…WHERE’S THAT SMOKE COMING FROM?”—Writing, play and...
  • J. Dibbel, Meet your next customer, Business (2003)
  • J. Donath, Identity and deception in the virtual community
  • V.J. Dubrovsky et al., The equalization phenomenon: status effects in computer-mediated and face-to-face decision-making groups, Human-Computer Interaction (1991)
  • D. Dukcevich, Instant messaging: Lands’ End's instant business, Forbes Magazine (2002)
  • Egger, F.N., 2003. From interactions to transactions: designing the trust experience for business-to-consumer...
  • P. Ekman, Why don’t we catch liars?, Social Research (1996)
  • P. Ekman et al., Relative importance of face, body, and speech in judgments of personality and affect, Journal of Personality and Social Psychology (1980)
  • B. Fulford, Korea's weird wired world, Forbes Magazine (2003)
  • W.L. Gardner et al., Impression management in organizations, Journal of Management (1988)
  • Golder, S.A., Donath, J., 2004. Hiding and revealing in online poker games. Proceedings of the Computer Supported...
  • S. Grazioli et al., Deceived—under target online, Communications of the ACM (2003)
  • Grazioli, S., Wang, A., 2001. Looking without seeing: understanding naïve consumers’ success and failure to detect...
  • J.L. Hale et al., Nonverbal primacy in deception detection, Communication Reports (1990)
  • K. Hall, Cyberfeminism
  • Hancock, J.T., Thom-Santelli, J., Ritchie, T., 2004. Deception and design: the impact of communication technologies on...
  • D. Handelman, Play and ritual: complementary frames of meta-communication
  • Heckman, C.E., Wobbrock, J.O., 2000. Put your best face forward: anthropomorphic agents, e-commerce consumers, and the...
  • Herring, S., 1994. Gender differences in computer-mediated communication: bringing familiar baggage to the new...