Abstract
This paper investigates the impact of time of day on truthfulness in human-agent interactions. Time of day has important implications for moral behavior in human-human interaction: the morning morality effect shows that people are more likely to act ethically (i.e., tell fewer lies) in the morning than in the afternoon. Based on previous work on disclosure and virtual agents, we propose that this effect will not bear out in human-agent interactions. In a preliminary evaluation of multi-issue bargaining tasks with the Conflict Resolution Agent, a semi-automated virtual human, individuals who lied told more lies to human negotiation partners than to virtual agent partners in the afternoon, and told more lies in the afternoon than in the morning when they believed they were negotiating with a human. Time of day had no significant effect on the number of lies told to the virtual agent.
© 2017 Springer International Publishing AG
Cite this paper
Mozgai, S., Lucas, G., Gratch, J. (2017). To Tell the Truth: Virtual Agents and Morning Morality. In: Beskow, J., Peters, C., Castellano, G., O'Sullivan, C., Leite, I., Kopp, S. (eds) Intelligent Virtual Agents. IVA 2017. Lecture Notes in Computer Science(), vol 10498. Springer, Cham. https://doi.org/10.1007/978-3-319-67401-8_37
DOI: https://doi.org/10.1007/978-3-319-67401-8_37
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-67400-1
Online ISBN: 978-3-319-67401-8
eBook Packages: Computer Science, Computer Science (R0)