Abstract
A humanoid robot, NAO, was introduced as a talking partner in classroom sessions teaching elementary school students about AI and robots, with the aim of stimulating empathy for intelligent machines. Two dialog types were defined. First, the query type dialog was defined as the robot’s answer to a human question. Second, the phatic type dialog was defined to express the personality of the robot. While the former is initiated by a formulated question, the latter can even be induced by misrecognition of human speech.
Applying this simple method, the same unit session on AI and robots was conducted in each of three classrooms. During the sessions, the students burst into laughter at 82% of the phatic type dialogs, and a laughing response was found at 44% of the query type dialogs. This representation made it easier for the students to empathize with the robot.
After the sessions, a questionnaire survey was conducted on the students’ preference for a robot pet, on what they would want to talk about with a robot that dreams at night, and on their view of life if AI robots replaced human workers. The results suggested that the students came to imagine a virtual subjectivity of the intelligent machines and considered a better life for humans alongside them.
1 Introduction
Mechanisms of the emergence of consciousness in the human brain are a fundamental and attractive problem of brain science [1,2,3]. On the other hand, modeling the consciousness of intelligent robots is of engineering importance [4,5,6]. Humor and empathy are also known as essential elements in building friendly communication [7]. This may be true between humans and intelligent machines. Empathy is regarded as the ability to see things from the other’s point of view [8]. If we seek in-depth insight into the intelligent sociable robot, it may be unsatisfactory to understand the robot simply as a mechanical system governed by an artificial algorithm.
In this study, fifth-grade elementary school students learning about Artificial Intelligence (AI) and its influence on society took part in a classroom session in which they asked one of the authors questions about AI and robots. The students surveyed AI and robots individually before the session. The teacher gave a lecture based on the students’ questions in the first half of the session and answered the students’ questions directly in the remaining time. In this session, the teacher introduced a humanoid robot, NAO (Softbank Robotics), as a talking partner. The robot gave explanation type talks and phatic type small talk [9]. The phatic type dialog conveyed a humorous personality and stimulated the students’ laughing responses. The laughing response is known to enhance empathy [10]. In a previous study of successive classroom sessions with second-grade elementary students, we found a development of thought on life through the projection of the students’ consciousness onto the robot partner [11].
The purpose of this study is to observe how the humorous personality of a talking partner robot stimulates elementary school students’ empathy for intelligent machines. We expect this empathy to lead to in-depth thinking about intelligent machines, which will exert a great impact on future human life. The students’ thinking is surveyed by a simple free-description questionnaire.
2 Method
2.1 Classroom and Humanoid Robot
Ninety-three fifth-grade male and female students in three classrooms participated in a pilot subject, “research,” that consisted of learning about “Society 5.0, a super smart society” [12]; home study on AI and AI-related technologies in daily life, with presentations; asking questions of one of the authors; visiting a museum of high technology; discussing what they understood about AI; and giving presentations on the impact of deep learning AI. The sessions studied in this paper were held for the above three classes in January 2019 and were conducted successively on the same day. Each session lasted 45 min, consisting of time for the teacher’s and robot’s explanations and time for the students’ questions.
The sociable humanoid robot used in this study as the talking partner of the lecture was NAO V5 (Softbank Robotics) [13] with the NAOqi operating system (version 2.1.4.13). The purpose of using the NAO robot was to let the students feel they were with a sociable robot that intervenes in human talk. The development environment was Choregraphe version 2.1.4. NAO represented an interface of the intelligent machine that exhibited sociable behavior. During the classroom sessions, the speech recognition threshold was set to 55% to avoid the over-responses caused by frequent misrecognition. NAO talked in a sitting position on the teacher’s desk in front of the blackboard.
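The effect of the 55% threshold can be illustrated with a minimal sketch. This is a plain-Python model of the filtering step, not actual NAOqi API calls: we assume the recognizer reports each candidate word with a confidence score, and only words at or above the threshold trigger a response.

```python
# Illustrative model (an assumption, not the NAOqi API): the recognizer
# yields (word, confidence) pairs, and only words whose confidence meets
# the 55% threshold are allowed to trigger a robot response.

THRESHOLD = 0.55  # speech recognition threshold used in the sessions

def filter_recognitions(recognitions, threshold=THRESHOLD):
    """Keep only recognized words whose confidence meets the threshold."""
    return [word for word, confidence in recognitions if confidence >= threshold]

# Hypothetical recognizer output: short words are often matched with
# moderate confidence, so some misrecognitions still pass the filter.
hypothetical_output = [("hello", 0.72), ("robot", 0.40), ("yes", 0.58)]
print(filter_recognitions(hypothetical_output))  # ['hello', 'yes']
```

A higher threshold would suppress more of these accidental matches at the cost of also missing correctly spoken utterances.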
2.2 Data Acquisition
Before the session, a response card survey of the students’ questions on AI was conducted for two classes. Each student wrote down one question of their own. To detect the students’ responses to NAO’s talk, the sounds of the classrooms were recorded.
After the session, a questionnaire survey was conducted to extract the emotional aspects of the students’ feelings when imagining life alongside AI or robots. The questions were as follows:
(1)
Do you like a robot that resembles a human or an animal? Please describe why you like or dislike it.
(2)
Imagine you have a robot that dreams at night. What would you talk about with your robot?
(3)
In the future, robots will work for people. If robot workers replace human workers, what would you like to do? If you would still like to work yourself, what kind of job would you like to do? What would you like to do other than that job?
Question (1) above implies life with an intelligent machine and empathy for it. Question (2) prompts the students to notice the inner side of their minds. Question (3) orients the students’ minds back to the public world and asks them to consider the relationship between intelligent machines and themselves. The students answered these questions in short written sentences. The texts were categorized on the basis of the broader-narrower association [14].
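The categorization step can be sketched as follows. This is a hypothetical illustration of the broader-narrower roll-up, with invented category labels: each answer text is assigned a narrow sub-category, and sub-categories map to broader super-categories whose instance counts are then tallied.

```python
from collections import Counter

# Hypothetical sketch of the categorization (labels are invented for
# illustration): the broader-narrower association maps each narrow
# sub-category to its broader super-category.
BROADER = {
    "cute appearance": "intuitive and anthropomorphism",
    "like a friend": "intuitive and anthropomorphism",
    "useful function": "objective function",
}

def count_by_supercategory(labels):
    """Count text instances per super-category via the broader association."""
    return Counter(BROADER[label] for label in labels)

labels = ["cute appearance", "like a friend", "useful function", "cute appearance"]
print(count_by_supercategory(labels))
# Counter({'intuitive and anthropomorphism': 3, 'objective function': 1})
```

The same roll-up yields the super-category percentages reported in Sect. 3, once the counts are divided by the total number of text instances.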
3 Results and Discussion
3.1 Students’ Questions on AI
The students’ questions on AI were collected in two classrooms before the session. In total, 54 students in the two classrooms wrote down questions. Some of the students’ writings included more than one question, so we collected 59 individual questions in total. These questions were categorized into four sub-categories and two super-categories, as shown in Fig. 1 and Table 1.
Many of the students had a vague understanding of AI and wanted to understand its definition and fundamental character, although they had previously studied AI-related technologies found in their personal lives. At the same time, since the students had studied applications of AI in daily-life apparatus, they showed substantial interest in the possibilities of AI and how it would change human life.
Table 1 shows the categorized types of students’ questions. The students asked what AI was and what would happen with AI. At first glance, their questions seem biased towards knowledge about AI rather than questions from their own minds. They might not yet have had enough of a sense of the reality of AI, or of life with robot support, to hold inspiration, sympathy, or a network of meanings for them. However, the questions shown in italics in Table 1, such as “Does AI have a heart?”, “Can AI take responsibility?”, and “What is required of humans in the AI society?”, imply that the students sensitively considered an extension of humanity to the machine. These questions are significant for considering human life with the support of AI or robots, rather than for attaining knowledge of them.
3.2 Types of Dialogs
In the classroom sessions, the humanoid robot NAO was used as a knowledge source. From time to time in a session, the teacher asked NAO a question about intelligent machines, and NAO answered it.
A dialog can be written as a pair of a human utterance and a robot response:

dialog = (utterance, response), utterance ∈ utterances, response ∈ responses,

where utterances is the set of defined utterances and responses is the set of defined responses, respectively.
Dialogs were categorized into two types, namely a query type and a phatic type. The query type is a question-and-answer type, in which the relationship between the utterance and the response is determined by their contexts and is independent of the situation. However, some of the explanations were based on the view of machines as a “robot’s personality.” For example, NAO asks the students, “We have algorithms inside; what kind of algorithm do you have?” The utterances are shaped in a formulated style using interrogative words, and no abbreviations were used. These conditions made the sentences relatively long, so the robot spoke the response sentence only when the utterance sentence was fully recognized.
The phatic type dialog is a more casual response that expresses a movement of the heart, such as an expression of feeling, acceptance or refusal, confirmation or embarrassment, muttering and nodding, small talk, and so on. This type of dialog is not a necessary condition of knowledge transfer but may affect the human mind in the sense of empathy. A fixed “personality” of the robot was assumed for the classroom sessions. We assume that a “virtual personality” may be expressed by a function f that maps a recognized utterance to a response:

response = f(utterance).
For the present sessions, an encouraging personality was considered, and a set of responses with encouraging and humorous words was prepared for f.
Further, many of the above phatic type utterances consisted of short words. As a result, misrecognition of undefined words as the defined phatic words was likely to occur even at a speech recognition threshold of 55%. Thus, a phatic type response could often be evoked accidentally by misrecognition of a human’s short utterance. In the classroom sessions described below, such accidental recognitions unexpectedly evoked 88% of the robot’s phatic responses.
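The two dialog types can be sketched as a simple dispatch. This is a minimal model under assumptions (the dialog contents, word lists, and function names are invented): a query answer fires only when the full formulated utterance is recognized, while any recognized short phatic word, including a misrecognition of some other short utterance, can trigger a response drawn from the encouraging set prepared for f.

```python
import random

# Illustrative sketch of the two dialog types (contents are assumptions,
# not the actual session scripts).
QUERY_DIALOGS = {
    # full formulated utterance -> fixed answer (query type)
    "what is artificial intelligence": "AI is a program that learns from data.",
}
PHATIC_WORDS = {"yes", "wow", "really"}          # short defined trigger words
PHATIC_RESPONSES = ["I will do my best!", "That sounds fun!"]  # the set behind f

def respond(recognized: str):
    """Return a robot response for a recognized utterance, or None."""
    utterance = recognized.lower().strip()
    if utterance in QUERY_DIALOGS:    # full sentence recognized -> query answer
        return QUERY_DIALOGS[utterance]
    if utterance in PHATIC_WORDS:     # short word (possibly misrecognized) -> phatic
        return random.choice(PHATIC_RESPONSES)
    return None                       # undefined or below threshold -> silence

print(respond("What is artificial intelligence"))
print(respond("wow"))
```

Because the phatic triggers are single short words, a misheard fragment of ordinary classroom speech can match one of them, which is how accidental recognitions came to evoke most of the phatic responses.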
3.3 Students’ Responses to the Robot’s Replies in the Classroom Session
Within the session talks of the three classrooms, NAO expressed query type responses 16 ± 1 times (average ± standard deviation) per session, led by the instructor’s utterances, and phatic type responses 11 ± 3 times per session, as shown by the black bars in Fig. 2. These are compared with the number of subsequent student reactions, particularly laughing: 7 ± 3 for the query type and 9 ± 2 for the phatic type, respectively. This corresponds to laughing rates of 44% for query reactions versus 82% for phatic reactions.
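The reported rates follow directly from these per-session averages, as a quick check shows:

```python
# Laughing rates from the per-session average counts reported above:
# about 7 of 16 query responses and 9 of 11 phatic responses drew laughter.
query_rate = 7 / 16
phatic_rate = 9 / 11
print(f"query: {query_rate:.0%}, phatic: {phatic_rate:.0%}")  # query: 44%, phatic: 82%
```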
As seen in Fig. 2, laughter followed 82% of the robot’s phatic responses; in many cases the students burst out laughing. The students listened particularly to the robot’s answers to the questions summarized in Table 1. Still, the students showed some response to nearly half of the robot’s query talks. The students’ friendly responses even to the query dialogs are supposed to be related to the personality expressed in the corresponding dialog, as well as to the friendly appearance and voice of NAO.
3.4 Students’ Acceptance of Robot and AI
As mentioned in Sect. 3.1, the students’ interests were mainly directed to knowledge of AI, as was also expected from the original lesson plan. We then conducted the experimental session of this study with NAO as a talking partner. In this section, we discuss the students’ acceptance of and empathy with intelligent machines after the session by analyzing the results of the questionnaire survey.
Figure 3 shows the results for the question of whether the students like a robot that resembles a human or an animal: 73% of the students answered that they like it, and 23% that they do not. The students did not discuss this topic in the classroom session, so the results are expected to reflect individual preference directly.
The session with NAO might have affected the students’ acceptance of intelligent machines through the anthropomorphism of the robot. Table 2 shows the hierarchically categorized reasons why the students like the robot as a substitute for a pet animal; the answers were summarized into four categories to clarify the tendencies of awareness. The students wrote one or more reasons, and the numbers in parentheses are the numbers of written instances of each reason. Figure 4 shows the numbers of written instances for each reason type.
As seen in Table 2 and Fig. 4, the intuitive and anthropomorphism type reasons are overwhelming, accounting for 86% of the comments of those who like robot pets. These reasons suggest that the students who accept robot pets may expect the technology to be a friend and a partner in life. The personality interface presented by NAO possibly induced this trend in the students’ awareness. Before the session, AI and robots were subjects that the students wanted to know about; during the session, the sociable intelligent machine became a subject of emotion.
The “communicate, friendliness” category particularly suggests the emotional factors of communication. The students seem to find the robot cute and to feel thankful when the robot works hard to communicate with them.
Table 3 shows the categorized reasons why the students do not like the robot as a substitute for a pet. The major criticism of the robot pet is that, since the machine does not have real life, a human cannot feel empathy with it. Also, because the robots are programmed to imitate human behavior, one cannot see an expression from the robot’s own heart. The other reasons are related to the danger of a machine that cannot be adequately controlled. These reasons show that the students who dislike the robot pet do not empathize with the robot and recognize it merely as a machine with no inherent consciousness.
Considering the super-categories (a) to (d) in Tables 2 and 3, category (a), intuitive and anthropomorphism, means that the students empathize with the pet robot, which may evoke richness in their hearts. The reason texts of category (a) accounted for 65% of all instances. This percentage is smaller than the percentage of students who answered that they like the robot pet, as shown in Fig. 3. Thus, some of the students like the robot pet because of its objective function or mechanism: the robot moves and speaks by its algorithm, and the responses that make the students feel amused come from that algorithm. Still, the majority of the students pay attention to the state of their own minds in forming the impression.
3.5 What Do the Students Want to Talk About with the Robot That Dreams at Night?
Question (2), described in Sect. 2.2, assumes that dreams come from a continuously working brain that unifies many latent elements of wishes. Students who keep pets experience that many pet animals look as if they are dreaming while sleeping. Elementary students can find many similarities or analogies between humans and a humanoid robot [11]. Although dreaming seems common to humans and animals, it remains a function not required of robots. Therefore, this question was intended to let the students consider a robot that might have subjectivity and consciousness in addition to sensing. Note that the teacher did not explain this intention to the students.
Table 4 shows a hierarchical arrangement of the categorized types of students’ answers. The super-category (a), the inner side, means that the students attempt to talk to the robot’s heart. Figure 5 shows a comparison of the numbers of text instances; the inner-side talk accounted for 53% of all text instances. The robot’s subjectivity and consciousness are of interest in talking with each other. The text instances of the information acquisition sub-category also suggested friendliness between human and robot.
In categories (b), informational, and (c), role of robot, there were questions on the future, possibilities, or risks of the intelligent machines. If the robots had no consciousness and subjectivity, these questions would not be asked of the intelligent machines; instead, the students would ask them of the developer or planner. Indeed, the present session was meant for asking questions about AI of the teacher who developed the robot’s program.
Although the number was only two, some students wrote that they want to be taken care of by the robot. This means that if the robot dreams at night and has subjectivity, they would accept the intelligent machine as a life supporter. One can make use of machines that have no subjectivity or consciousness; however, when it comes to being taken care of oneself, one may be anxious if the machine is no more than an algorithm. One may essentially hope to empathize with the machine, feeling the machine’s consciousness.
3.6 Future Life with Robots
The last question, (3) in Sect. 2.2, asked what the students would be interested in doing in a situation where humans are not always obliged to work. Overall, education in Japan is meant to develop young people into autonomous and subjective citizens who work for or contribute to society. The teacher explained the possibility of a robot with general-purpose AI, which could work flexibly like a human; under such circumstances, more people might freely choose whether or not to work. At this point in the session, the students could assume the subjectivity of intelligent machines as an essence of the human-machine interface (Fig. 6).
Before the session, as shown in Table 1, the students’ questions were mainly technical; the questions related to humanity, shown in italics in Table 1, numbered nine out of all 59. In the questionnaire survey after the session, 45% of the students clearly stated that they would like to work. The breakdown of what jobs they want is shown in Fig. 7.
The students were quite positive about working. Notably, 32% of those who want to work stated that they want to develop, drive, or manage the AI robots themselves. This tendency may have been evoked by this curriculum.
Of those who want to work, 26% preferred creative work, such as designer, architect, town planner, writer, chef, pâtissier, artist, or broadcaster. Another 24% preferred to be specialists, such as prosecutor, lawyer, police officer, teacher, sports player, or game player, or to have jobs in which communication is essential. A further 17% preferred jobs of hospitality, such as animal feeder and other animal-related work, doctor, nurse, or shopkeeper. These jobs are considered to be the students’ ordinary dreams.
On the other hand, the things the students would like to do other than jobs were, for instance, leisure, play, hobbies, travel, cooking, housekeeping, games, and so on. These are usually done in vacation time, and the idea of a general-purpose AI robot had little influence on the students’ ideas of daily life.
In contrast, the descriptions not limited to the concept of work were intriguing: things that robots cannot do (8), things that humans can do (6), AI as a partner (6), helping other people and contributing to the world (3), staying comfortable (1), and spending diverse time (1), where the numbers in parentheses are the numbers of text instances of each description type. These text instances are descriptive, in comparison with the nominal job names above, and they reflect imaginations of a world in which AI robots work for humans. A common characteristic of these descriptions is that no specific job is named; whether an activity is an occupation or not may no longer be essential. The students describe human actions or their own will, and this may be a chance to consider how they would like to live.
4 Concluding Remarks
This paper presented a case study of classroom sessions on AI and robots with the humanoid robot NAO as a talking partner for fifth-grade elementary school students. The students surveyed AI before the sessions and individually prepared questions on AI. To construct the dialogs, a query type dialog for providing knowledge and a phatic type dialog for occasional small talk with a personality effect were created. During the sessions, laughter occurred at 82% of the phatic type responses, and at 44% of the query type dialogs.
The students’ questions prepared for the sessions were mainly about knowledge of AI, whereas a few questions concerned an extension of humanity to the machines. After the sessions with NAO, the students wrote answer texts to three questions; the texts were categorized and summarized into type-hierarchy tables.
First, it was found that although the students had an objective impression of AI and robots, many of them obtained subjective views through the image of the robot as a pet and can feel emotional richness through empathy with the robot pet.
Second, the students considered a dialog with an intelligent machine that has subjectivity. The students’ texts implied that they expect communication between human subjectivity and the robot’s subjectivity.
Third, as for a future society in which general-purpose robots work in place of humans, a majority of the students expected that they would do their favorite jobs or free-time activities. Notably, however, other students described how they could spend better time as humans.
This case study implies that the personality design affected the students’ emotions and let them imagine conscious intelligent machines. The students noticed that if they could recognize a machine’s subjectivity, they could communicate with it beyond mere knowledge of it and consider a better life with it as humans.
References
Dehaene, S.: Consciousness and the Brain: Deciphering How the Brain Codes Our Thoughts. Penguin, London (2014)
Massimini, M., Tononi, G.: Nulla di più grande. Baldini & Castoldi, Milan (2013)
Koch, C., Massimini, M., Boly, M., Tononi, G.: Neural correlates of consciousness: progress and problems. Nat. Rev. Neurosci. 17, 307–321 (2016)
Kubota, N., Kojima, F., Fukuda, T.: Self-consciousness and emotion for a pet robot with structured intelligence. In: Proceedings of Joint 9th IFSA World Congress and 20th NAFIPS International Conference, vol. 5, pp. 2786–2791 (2001)
Takeno, J., Akimoto, S.: A conscious robot to perceive the unknown. In: 2010 IEEE International Conference on Systems, Man and Cybernetics, pp. 1375–1379 (2010)
Komatsu, T., Takeno, J.: A conscious robot that expects emotions. In: 2011 IEEE International Conference on Industrial Technology, pp. 15–20 (2011)
Wu, S.-R.: Humor and empathy: developing students’ empathy through teaching robot to tell English jokes. In: 2nd IEEE International Conference on Digital Game and Intelligent Toy Enhanced Learning, pp. 213–214 (2017)
Edirisinghe, C., Nakatsu, R., Widodod, J.: Empathy as a factor for a new social contract. In: 2013 International Conference on Culture and Computing, pp. 161–162 (2013)
Bilandzic, M., Filonik, D., Gross, M., Hackel, A., Mangesius, H., Kremar, H.: A mobile application to support phatic communication in the hybrid space. In: 2009 Sixth International Conference on Information Technology: New Generations, pp. 1517–1521 (2009)
Egawa, S., Sejima, Y., Sato, Y., Watanabe, T.: Laughing-driven pupil response system for inducing empathy. In: Proceedings of the 2016 IEEE/SICE International Symposium on System Integration, pp. 520–525 (2016)
Omokawa, R., Matsuura, S.: Development of thought using a humanoid robot in an elementary school classroom. In: Antona, M., Stephanidis, C. (eds.) UAHCI 2018. LNCS, vol. 10908, pp. 541–552. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-92052-8_43
Cabinet Office Government of Japan. https://www.gov-online.go.jp/cam/s5/eng/index.html. Accessed 12 Feb 2019
Softbank Robotics. https://www.softbankrobotics.com/emea/en. Accessed 12 Feb 2019
Matsuura, S.: Development of a trans-field learning system based on multidimensional topic maps, linked topic maps. In: Fifth International Conference on Topic Maps Research and Applications TMRA 2009, University of Leipzig, Institut für Informatik, vol. 19, pp. 83–89 (2009)
Acknowledgments
We would like to thank Masaru Tashiro and Ryo Sugimoto of Tokyo Gakugei University Oizumi Elementary School for providing us with the opportunity to hold the classroom sessions. This work was partly funded by a Grant-in-Aid for Scientific Research (C) 15K00912 from the Ministry of Education, Culture, Sports, Science and Technology.
Omokawa, R., Kobayashi, M., Matsuura, S. (2019). Expressing the Personality of a Humanoid Robot as a Talking Partner in an Elementary School Classroom. In: Antona, M., Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. Theory, Methods and Tools. HCII 2019. Lecture Notes in Computer Science, vol 11572. Springer, Cham. https://doi.org/10.1007/978-3-030-23560-4_36