1 Introduction

The mechanism by which consciousness emerges in the human brain is a fundamental and attractive problem in brain science [1,2,3]. At the same time, modeling the consciousness of intelligent robots is of engineering importance [4,5,6]. Humor and empathy are known to be essential elements in building friendly communication [7], and this may hold between humans and intelligent machines as well. Empathy is regarded as the ability to understand things from the other's point of view [8]. If we seek in-depth insight into intelligent sociable robots, it may be insufficient to understand the robot simply as a mechanical system governed by an artificial algorithm.

In this study, fifth-grade elementary school students learning about Artificial Intelligence (AI) and its influence on society took part in a classroom session in which they asked one of the authors questions about AI and robots. The students had individually surveyed AI and robots before the session. The teacher gave a lecture based on the students' questions in the first half of the session and answered their questions directly in the remaining time. In this session, the teacher introduced the humanoid robot NAO (SoftBank Robotics) as a talking partner. The robot produced both explanation-type talk and phatic-type small talk [9]. The latter, phatic-type dialog conveyed a humorous personality and stimulated the students' laughing responses. Laughing responses are known to enhance empathy [10]. In a previous study, we observed second-grade elementary students develop their thinking about life by projecting their consciousness onto a robot partner over successive classroom sessions [11].

The purpose of this study is to observe how the humorous personality of a talking-partner robot stimulates elementary school students' empathy toward intelligent machines. We expect this empathy to lead to in-depth thinking about the intelligent machines that will exert a great impact on future human life. The students' thinking was surveyed with a simple free-description questionnaire.

2 Method

2.1 Classroom and Humanoid Robot

Ninety-three fifth-grade male and female students in three classrooms participated in a pilot subject, "research," that consisted of learning about "Society 5.0, a super-smart society" [12]; home study on AI, AI-related technologies in daily life, and a presentation; asking questions to one of the authors; visiting a museum of high technology; discussing what they understood about AI; and giving presentations on the impact of deep-learning AI. The sessions studied in this paper were held for the above three classes in January 2019 and were conducted successively on the same day. Each session lasted 45 min, consisting of time for the teacher's and robot's explanations and time for the students' questions.

The sociable humanoid robot used in this study as the talking partner for the lecture was NAO V5 (SoftBank Robotics) [13] with the NAOqi operating system (version 2.1.4.13). The purpose of using the NAO robot was to let the students experience being with a sociable robot that intervenes in human talk. The development environment was Choregraphe version 2.1.4. NAO served as an interface to the intelligent machine and exhibited sociable behavior. During the sessions, the speech recognition threshold was set to 55% to avoid excessive responses caused by frequent misrecognition. NAO talked in a sitting position on the teacher's desk in front of the blackboard. A minimal illustration of this threshold setting is sketched below.
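The following sketch shows how such a confidence threshold could be applied with the NAOqi Python SDK. It is a minimal illustration under stated assumptions, not the authors' code: the robot's IP address, the vocabulary, and the polling loop are hypothetical, and a production setup would typically use Choregraphe boxes or ALMemory event callbacks rather than polling.

```python
# -*- coding: utf-8 -*-
# Minimal sketch (not the authors' code): filter NAO's speech
# recognition results by a 55% confidence threshold, as described
# above. Assumes the NAOqi Python SDK; the IP address and vocabulary
# are hypothetical placeholders.
import time
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"  # hypothetical LAN address of NAO
PORT = 9559
THRESHOLD = 0.55           # 55% speech recognition threshold

asr = ALProxy("ALSpeechRecognition", ROBOT_IP, PORT)
memory = ALProxy("ALMemory", ROBOT_IP, PORT)
tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)

asr.setLanguage("Japanese")
asr.setVocabulary(["konnichiwa", "sugoi"], False)  # hypothetical words
asr.subscribe("classroom_session")

try:
    while True:
        # ALMemory holds the latest result as [word, confidence, ...];
        # respond only when confidence reaches the threshold, so that
        # frequent misrecognitions do not trigger excess responses.
        data = memory.getData("WordRecognized")
        if data and len(data) >= 2 and data[1] >= THRESHOLD:
            tts.say("I heard: " + data[0])
        time.sleep(0.5)
finally:
    asr.unsubscribe("classroom_session")
```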

2.2 Data Acquisition

Before the session, a response-card survey of the students' questions on AI was conducted in two classes. Each student wrote down one question of their own. To detect the students' responses to NAO's talk, the sound of the classrooms was recorded.

After the session, a questionnaire survey was conducted to extract the emotional aspect: the students' feelings as they imagined living with AI or robots. The questions were as follows:

(1) Do you like a robot that resembles humans or animals? Please describe why you like it or not.

(2) Imagine you have a robot that dreams at night. What would you talk about with your robot?

(3) In the future, robots will work for people. If robot workers replace human workers, what would you like to do? If you would still like to work yourself, what kind of job would you like to do? What would you like to do other than the job?

Question (1) above implies life with an intelligent machine and empathy for it. Question (2) prompts the students to notice the inner side of their minds. Question (3) orients the students' minds back to the public world and has them consider the relationship between intelligent machines and themselves. The students answered these questions in short sentences. The texts were categorized on the basis of broader-narrower association [14], as sketched below.
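The sketch below illustrates this kind of tallying, assuming the answers have already been manually coded into super- and sub-categories; the category names and texts are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch only: tallying free-text answers into a
# broader/narrower (super-/sub-) category hierarchy, in the spirit
# of the categorization described above. The coded triples below are
# hypothetical placeholders.
from collections import defaultdict

# super-category -> sub-category -> list of text instances
hierarchy = defaultdict(lambda: defaultdict(list))

# (super-category, sub-category, answer text) from manual coding
coded_answers = [
    ("inner side", "feelings", "What do you dream about?"),
    ("informational", "future", "What will AI be able to do?"),
]

for super_cat, sub_cat, text in coded_answers:
    hierarchy[super_cat][sub_cat].append(text)

# Print counts per category, as in the type-hierarchy tables.
for super_cat, subs in hierarchy.items():
    total = sum(len(texts) for texts in subs.values())
    print("%s (%d)" % (super_cat, total))
    for sub_cat, texts in subs.items():
        print("  %s (%d)" % (sub_cat, len(texts)))
```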

3 Results and Discussion

3.1 Students’ Questions on AI

The students' questions on AI were collected in two classrooms before the session. In total, 54 students in the two classrooms wrote down their questions. Some of the students' writings included more than one question, so we collected 59 individual questions in total. These questions were categorized into four sub-categories and two super-categories, as shown in Fig. 1 and Table 1.

Fig. 1. Categorized questions on AI

Table 1. Students' questions on AI to the teacher. The left two columns show the super- and sub-category names of the questions. The right column shows the instance names of the questions. Numbers in parentheses are the numbers of actual questions (text instances). The words written in italics imply the students' sense of humanity.

Many of the students had a vague understanding of AI and wanted to understand its definition and fundamental character, even though they had previously studied AI-related technologies found in their personal lives. At the same time, since the students had studied applications of AI in daily-life apparatus, they showed substantial interest in the possibilities of AI and how it would change human life.

Table 1 shows the categorized types of the students' questions. The students asked what AI was and what would happen with AI. At first glance, their questions seem biased toward knowledge about AI rather than questions from their own minds. They might not yet feel enough reality about AI, or about life with robot support, to hold inspiration, sympathy, or a network of meanings for them. However, the questions shown in italics in Table 1, such as "does AI have a heart?", "can AI take responsibility?", and "what is required of humans in the AI society?", imply that the students sensitively consider an extension of humanity to the machine. These questions have significance in considering human life with the support of AI or robots, rather than in attaining knowledge of them.

3.2 Types of Dialogs

In the classroom sessions, the humanoid robot NAO was used as a knowledge source. The teacher asked NAO a question about intelligent machines from time to time during the session, and NAO answered the question.

The dialog can be written in the following form using a human utterance and a robot response:

$$ \text{u:}\,(\sim\text{utterances})\;\sim\text{responses}, $$

where "utterances" is a set of utterances and "responses" is a set of responses.
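As a concrete illustration of this rule form, the sketch below maps a set of utterances to a single response in plain Python. The "u:" notation above appears to follow the rule style of NAOqi's QiChat/ALDialog language, but the following is only a hypothetical illustration of the mapping, not the session's actual dialog content.

```python
# Plain-Python sketch of the rule form u:(utterances) -> responses.
# All phrases are hypothetical examples.
query_rules = {
    frozenset(["what is an algorithm",
               "please explain algorithms"]):
        "An algorithm is a procedure that tells a computer what to do.",
}

def respond(heard):
    """Return the response whose utterance set contains the heard text."""
    for utterances, response in query_rules.items():
        if heard in utterances:
            return response
    return None  # stay silent when nothing matches fully

print(respond("what is an algorithm"))
```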

Dialogs were categorized into two types: a query type and a phatic type. The query type is a question-and-answer type, in which the relationship between the utterance and the response is determined by their content and is independent of the situation. However, some of the explanations were phrased from the machine's own viewpoint, as a "robot's personality." For example, NAO asks the students, "We have algorithms inside; what kind of algorithm do you have?" The utterances were phrased in a formulaic style using interrogative words, and no abbreviations were used. These conditions made the sentences relatively long, so the robot spoke the response sentence only when the utterance sentence was fully recognized.

The phatic-type dialog is a more casual response spoken as a movement of the heart, such as an expression of feeling, acceptance or refusal, confirmation or embarrassment, muttering and nodding, small talk, and so on. Dialog of this type is not a necessary condition of knowledge transfer, but it may affect the human mind in the sense of empathy. A fixed "personality" of the robot was assumed for the classroom session. We assume that a "virtual personality" may be expressed by a function f as

$$ f(\text{phatic type utterances}) = \text{phatic type responses}. $$

For the present session, an encouraging personality was considered, and a set of responses with encouraging and humorous words was prepared for f.
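A minimal sketch of how such an f might look, assuming a fixed table of phatic utterances each mapped to a pool of encouraging, humorous responses; all phrases are hypothetical examples, not the session's actual dialog content.

```python
# Sketch of the "virtual personality" f described above: a phatic
# utterance maps to a pool of encouraging, humorous responses, one
# of which is chosen at random. Phrases are hypothetical examples.
import random

phatic_responses = {
    "wow": ["Thank you, I practiced a lot!",
            "Now I am a little shy."],
    "really": ["Really, really! Robots never tell lies... probably."],
}

def f(utterance):
    """Map a phatic utterance to one response of the fixed personality."""
    pool = phatic_responses.get(utterance)
    return random.choice(pool) if pool else None

print(f("wow"))
```

Choosing randomly from a fixed pool keeps the personality consistent while avoiding verbatim repetition of the same reply.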

Further, many of the above phatic-type utterances consisted of short words. As a result, misrecognition of undefined words as the defined phatic words was likely to occur even at a speech recognition threshold of 55%. Thus, occasional phatic-type responses could be evoked by misrecognition of short human utterances. In the classroom sessions described below, such occasional recognitions unexpectedly evoked 88% of the robot's phatic responses.

3.3 Students’ Responses to the Robot’s Replies in the Classroom Session

Within the session talks of the three classrooms, NAO gave query-type responses 16 ± 1 times (average ± standard deviation) led by the instructor's utterances, and gave phatic-type responses 11 ± 3 times per session, as shown by the black bars in Fig. 2. These are compared with the numbers of subsequent student reactions, particularly laughing: 7 ± 3 for the query type and 9 ± 2 for the phatic type, respectively. That is, the students laughed at 44% of the query-type responses versus 82% of the phatic-type responses.
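These percentages follow directly from the average counts per session; a quick arithmetic check:

```python
# Laughter rates from the average counts quoted above.
query_rate = 7.0 / 16.0    # laughs per query-type response
phatic_rate = 9.0 / 11.0   # laughs per phatic-type response
print("query: %.0f%%, phatic: %.0f%%"
      % (query_rate * 100, phatic_rate * 100))
# -> query: 44%, phatic: 82%
```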

Fig. 2. Stacked bar chart comparing the average number of robot responses and the subsequent student reaction voices for the query-type and phatic-type dialogs.

As seen in Fig. 2, laughter followed 82% of the robot's phatic responses; in many cases the students burst out laughing. The students listened particularly carefully to the robot's answers to the questions summarized in Table 1. Still, the students responded to nearly half of the robot's query talks. The students' friendly responses even to the query dialog are presumably related to the personality built into the corresponding dialog, as well as to NAO's friendly appearance and voice.

3.4 Students’ Acceptance of Robot and AI

As mentioned in Sect. 3.1, the students' interests were mainly directed to knowledge about AI. This was also expected from the original lesson plan. We then conducted the experimental session of this study with NAO as a talking partner. In this section, we discuss the students' acceptance of, and empathy with, intelligent machines after the session by analyzing the results of the questionnaire survey.

Figure 3 shows the results for the question of whether the student likes a robot that resembles humans or animals: 73% of the students answered that they like it, and 23% that they do not. The students did not discuss this topic in the classroom session, so the results are expected to reflect individual preferences directly.

Fig. 3. Numbers of students who like or do not like a human-like or animal-like robot.

The session with NAO might have affected the students' acceptance of intelligent machines through the anthropomorphism of the robot. Table 2 shows the hierarchically categorized reasons why the students like the robot as a substitute for a pet animal; the students' answers were summarized into four categories to clarify the tendencies in their awareness. Each student wrote one or more reasons, and the numbers in parentheses are the numbers of written instances of reasons. Figure 4 shows the numbers of written instances for each reason type.

Table 2. Students’ reasons why they like robots as a substitute for pet animals.
Fig. 4. Numbers of students' written instances of reasons, categorized into four supertypes, for why they like human- and animal-like robots.

As seen in Table 2 and Fig. 4, the intuitive and anthropomorphism-type reasons are overwhelming, accounting for 86% of the comments of those who like robot pets. These types of reasons suggest that the students who accept robot pets expect the technology to be a friend and a life partner. Exposure to the personality interface presented by NAO possibly induced this trend in the students' awareness. Before the session, AI and robots were subjects the students wanted to know about; during the session, the sociable intelligent machine became a subject of emotion.

The "communicate, friendliness" category particularly suggests emotional factors of communication. The students seem to find the robot cute and to feel thankful if the robot works hard to communicate with them.

Table 3 shows the categorized types of reasons why the students do not like the robot as a substitute for a pet. The major criticism of the robot pet is that, since the machine does not have real life, humans cannot feel empathy with it. Moreover, robots are programmed to imitate human behavior, so one cannot see an expression from the robot's own heart. Other types of reasons relate to the danger of a machine that cannot be adequately controlled. These reasons show that the students who dislike the robot pet do not empathize with the robot and recognize it merely as a machine with no inherent consciousness.

Table 3. Students’ reasons why they do not like robots as a substitute for pet animals.

Considering all of the super-categories from (a) to (d) in Tables 2 and 3, category (a), intuitive and anthropomorphism, means that the students empathize with the pet robot, which may evoke richness in their hearts. The reason text instances of category (a) accounted for 65%. This percentage is smaller than the percentage of students who answered that they like the robot pet, as shown in Fig. 3. Thus, some of the students like the robot pet because of its objective function or mechanism: the robot moves and speaks by its algorithm, and the responses that amuse the students come from that algorithm. The majority of the students, however, pay attention to the state of their own minds in forming the impression.

3.5 What Do the Students Want to Talk About with the Robot that Dreams at Night?

Question (2) described in Sect. 2.2 assumes that dreams come from a continuously working brain that unifies many latent elements of wishes. Students who keep pets know that many pet animals look as if they are dreaming while sleeping. Elementary students can find many similarities or analogies between humans and a humanoid robot [11]. Although dreaming seems common to humans and animals, it remains a function not required of robots. Therefore, this question was intended to let the students consider a robot that might have subjectivity and consciousness in addition to sensing. Note that the teacher did not explain this intention to the students.

Table 4 shows a hierarchical arrangement of the categorized types of students' answers. The super-category (a), the inner side, means that the students attempt to talk to the robot's heart. Figure 5 shows a comparison of the numbers of text instances. Inner-side talk accounted for 53% of all text instances; the robot's subjectivity and consciousness are of interest in talking with each other. The text instances of the information-acquisition sub-category also suggested friendliness between human and robot.

Table 4. Students' answers on what they would like to talk about with a pet robot that dreams at night.
Fig. 5. Categorized topics the students want to talk about with the robot that dreams at night.

In categories (b), informational, and (c), role of robot, there were questions on the future, possibilities, or risks of intelligent machines. If the robots had no consciousness or subjectivity, these questions would not be asked of the intelligent machines; instead, the students would ask them of the developer or planner. Indeed, the present session was meant for asking questions on AI to the teacher who developed the robot program.

Although the number was only two, some students wrote that they want to be taken care of by the robot. This means that if the robot dreams at night and has subjectivity, they will accept the intelligent machine as a life supporter. One can make use of machines that have no subjectivity or consciousness; however, when it comes to being taken care of oneself, one may be anxious if the machine is no more than an algorithm. One may essentially hope to empathize with the machine, feeling the machine's consciousness.

3.6 Future Life with Robots

The last question (2.2-3) asked what the students would be interested in doing in a situation where humans are not always obliged to work. Overall, education in Japan is meant to develop young people into autonomous, subjective citizens who work for or contribute to society. The teacher explained the possibility of a robot with general-purpose AI, which can work flexibly like a human. Under such circumstances, more people might freely choose whether or not to work by their own will. At this point in the session, the students could assume the subjectivity of intelligent machines as an essence of the human-machine interface (Fig. 6).

Fig. 6. Students' answers on what they want to do if the robots work for them.

Before the session, as shown in Table 1, the students' questions were mainly technical; the questions related to humanity, shown in italics in Table 1, numbered nine of all 59. In the questionnaire survey after the session, 45% of the students clearly stated that they would like to work. The breakdown of what jobs they want is shown in Fig. 7.

Fig. 7. Numbers of text instances of actual jobs in four categories.

The students were quite positive about working. Notably, 32% of those who want to work stated that they want to develop, drive, or manage AI robots themselves. This tendency may have been invoked by this curriculum.

26% of them preferred creative work, such as designer, architect, town planner, writer, chef, pâtissier, artist, and broadcaster. 24% of the students preferred to be specialists, such as prosecutor, lawyer, police officer, teacher, sports player, or game player, or to have jobs in which communication is essential. 17% of the students preferred jobs of hospitality, such as animal feeder, animal-related work, doctor, nurse, or shop staff. These jobs can be considered the students' ordinary dreams.

On the other hand, the things the students would like to do other than jobs were, for instance: leisure, play, hobbies, travel, cooking, housekeeping, games, and so on. These are usually done in vacation time, and the idea of a general-purpose AI robot has little influence on the students' ideas of daily life.

In contrast, the descriptions not limited to the concept of work were intriguing: things that the robots cannot do (8); doing things that a human can do (6); AI as a partner (6); helping other people and contributing to the world (3); staying comfortable (1); spending diverse time (1). Here, the numbers in parentheses are the numbers of text instances of the indicated types of description. These text instances are descriptive, in comparison with the nominal ones above, and they reflect imaginations of a world in which AI robots work for humans. A common characteristic of these descriptions is that they do not specify jobs; whether something is an occupation or not may no longer be an effective distinction. The students describe human actions or their own will, and this may be a chance for them to consider how they would like to live.

4 Concluding Remarks

This paper presented a case study of classroom sessions on AI and robots with the humanoid robot NAO as a talking partner for fifth-grade elementary school students. The students surveyed AI before the sessions and individually prepared questions on AI. For the dialogs, a query type for providing knowledge and a phatic type for occasional small talk with a personality effect were created. During the sessions, laughter occurred at 82% of the phatic-type responses, versus 44% of the query-type responses.

The students' questions prepared for the session were mainly about knowledge of AI, whereas a few questions concerned an extension of humanity to the machines. After the session with NAO, the students wrote answer texts to three questions. The answer texts were categorized and summarized into type-hierarchy tables.

First, it was found that although the students had an objective impression of AI and robots, many of them obtained subjective views through the image of the robot as a pet and could feel emotional richness through empathy with the robot pet.

Second, the students considered a dialog with an intelligent machine that has subjectivity. The students' texts implied that they expect communication between human and robot subjectivities.

Third, as for a future society in which general-purpose robots work in place of humans, a majority of the students expected that they would do their favorite jobs or free-time activities. Notably, however, other students described how they could spend better time as humans.

This case study implies that personality design affected the students' emotions and let them imagine conscious intelligent machines. The students noticed that if they could recognize a machine's subjectivity, they could communicate with it beyond merely knowing about it, and consider a better life with it as humans.