Abstract
The discussion around artificial empathy and its ethics is not new. The concept appears in classic science fiction such as Star Trek and Blade Runner and is explored in more recent interactive media such as the video game Detroit: Become Human. In most depictions, emotions and empathy are presented as the key to being human. Misselhorn's new publication shows that these futuristic stories are becoming increasingly relevant today. We must ask ourselves whether we are socially responsible enough to deal with the consequences of artificial empathy and awareness. If we create artificial life, we should be prepared to treat it as a living being deserving of respect, rather than continuing to categorize it as an object. The author does not rule out the possibility that machines might one day become more human than humans themselves, and that we might even lose our own specific cognitive, emotional and social abilities.
Author contributions
All authors contributed to the study conception and design. Material preparation, data collection and analysis were performed by Janina L. Samuel and André Schmiljun. The first draft of the manuscript was written by André Schmiljun and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.
Ethics declarations
Conflict of interest
On behalf of all authors, the corresponding author states that there is no conflict of interest.
Cite this article
Cite this article
Samuel, J.L., Schmiljun, A. What dangers lurk in the development of emotionally competent artificial intelligence, especially regarding the trend towards sex robots? A review of Catrin Misselhorn’s most recent book. AI & Soc 38, 2717–2721 (2023). https://doi.org/10.1007/s00146-021-01261-6