Abstract
This work presents a proof-of-concept prototype of a special kind of virtual actor — an intelligent assistant and partner called here the “Virtual Listener” (VL). The main role of the VL is to establish and maintain socially-emotional contact with the participant, thereby providing feedback on the human's performance using minimal resources such as body language and facial expressions. This kind of personal assistant is intended for a broad spectrum of application paradigms, from assistance in lecture preparation to art and design creation, insight problem solving, and more, and is in principle extendable to assistance in virtually any professional task. The key new element is an interface based on facial expressions. The concept is implemented and tested in limited prototypes. Implications for future human-level artificial intelligence are discussed.
Acknowledgments
The authors are grateful to Drs. Dolenko S.A., Ushakov V.L., Redko V.G., Klimov V.V., Ms. Tikhomirova D.V., Mr. Polstyankin K.V., and NRNU MEPhI students who participated in pilot experiments and contributed to many discussions of the concept. One of the authors (Alexei Samsonovich) is grateful to Ms. Kostkina A.D. for the inception of the idea of a Virtual Listener. This work was supported by the Russian Science Foundation (RSF) Grant # 18-11-00336.
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Eidlin, A.A., Chubarov, A.A., Samsonovich, A.V. (2020). Virtual Listener: Emotionally-Intelligent Assistant Based on a Cognitive Architecture. In: Samsonovich, A. (eds) Biologically Inspired Cognitive Architectures 2019. BICA 2019. Advances in Intelligent Systems and Computing, vol 948. Springer, Cham. https://doi.org/10.1007/978-3-030-25719-4_10
DOI: https://doi.org/10.1007/978-3-030-25719-4_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-25718-7
Online ISBN: 978-3-030-25719-4
eBook Packages: Intelligent Technologies and Robotics (R0)