The simulation of an emotional robot implemented with fuzzy logic


Abstract

Recently, researchers have tried to better understand human behaviors so as to let robots act in more human ways, which means a robot may have its own emotions, as defined by its designers. To achieve this goal, in this study we designed and simulated a robot, named Shiau_Lu, which is endowed with the six universal human emotions: happiness, anger, fear, sadness, disgust, and surprise. When a sentence is input to Shiau_Lu through voice, it recognizes the sentence by invoking the Google speech recognition method running on an Android system and outputs a sentence to reveal its current emotional state. Each input sentence affects the strengths of the six emotional variables used to represent the six emotions, one variable per emotion, after which the emotional variables change to new states. A fuzzy inference process then determines the most significant emotion as the primary emotion, with which an appropriate output sentence is chosen from the robot's Output-sentence database as a response to the input. With the new states of the six emotional variables, when the robot encounters another sentence, the above process repeats and another output sentence is selected and replied. Artificial intelligence and psychological theories of human behavior have been applied to the robot to simulate how emotions are influenced by the outside world through language. In fact, the robot may help autistic children interact more with the world around them and relate well to the outside world.




Acknowledgments

This work was partially supported by Tunghai University under the project GREENs, and by the National Science Council, Taiwan, under Grants NSC 102-2221-E-029-003-MY3 and NSC 100-2221-E-029-018.


Correspondence to Fang-Yie Leu.


Communicated by A. Castiglione.

Appendices

Appendix A

The sentences collected in the questionnaire are as follows. “Please assess the emotional values from the viewpoint of a girl when you hear the following sentences. Also, please indicate the appropriate offset value, in your mind, for this numerical design.”


Will you marry me?

\(\ldots\)


The average value of the 20 offsets collected is \(\pm 2.2\) \((=((0\times 0)+(1\times 3)+(2\times 11)+(3\times 5)+(4\times 1))/20)\). Hence, we choose the value range between \(-2\) and 2 as the tuning values of newly produced emotions.
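The offset average above can be checked with a short computation; the tallies (3 respondents chose 1, 11 chose 2, 5 chose 3, 1 chose 4, and none chose 0) are taken directly from the formula:

```python
# Tallies from the questionnaire formula in Appendix A:
# offset value -> number of the 20 respondents who chose it
counts = {0: 0, 1: 3, 2: 11, 3: 5, 4: 1}

total_respondents = sum(counts.values())               # 0 + 3 + 11 + 5 + 1 = 20
weighted_sum = sum(v * n for v, n in counts.items())   # 0 + 3 + 22 + 15 + 4 = 44
average_offset = weighted_sum / total_respondents      # 44 / 20 = 2.2

print(average_offset)
```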


Appendix B

Table 6 lists a part of the sentences collected in the Input-sentence database and their corresponding emotional updating variables: happiness (\(E_{\mathrm{h}}\)), anger (\(E_{\mathrm{a}}\)), fear (\(E_{\mathrm{f}}\)), sadness (\(E_{\mathrm{sad}}\)), disgust (\(E_{\mathrm{d}}\)), and surprise (\(E_{\mathrm{sur}}\)).

Table 6 A part of the input sentences and the scores of their emotional updating variables collected in the Input-sentence database

Appendix C

Table 7 lists a part of the content of the Output-sentence database. In fact, for the same input sentence and the same level of the same emotion, Shiau_Lu may select different output sentences. For example, when Shiau_Lu is in a happy mood, the input sentence “You look so beautiful!” has two possible responses: “You always make me feel happy!” and “Thanks.” The choice between them is made by a random function.

Table 7 The 18 non-neutral-sentence lists of the output-sentence database
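This random selection can be sketched as follows; the `output_sentences` table here is a hypothetical stand-in holding only the example pair quoted above, whereas the real database has 18 such lists:

```python
import random

# Hypothetical sketch of the response selection in Appendix C: each
# (input sentence, primary emotion) pair maps to a list of candidate
# responses, and one of them is chosen at random.
output_sentences = {
    ("You look so beautiful!", "happiness"): [
        "You always make me feel happy!",
        "Thanks.",
    ],
}

def choose_response(sentence, emotion, rng=random):
    """Pick one of the candidate responses uniformly at random."""
    return rng.choice(output_sentences[(sentence, emotion)])
```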

Appendix D

Table 8 lists the “Value range” and “Judged as” entries of happiness, a positive emotion, and of fear, a negative emotion, corresponding to the items along the X-axis shown in Fig. 7.
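The idea behind the “Value range” → “Judged as” mapping can be sketched as below. Adjacent ranges may overlap, and a value falling in an overlap can be judged either way by a random draw (Appendix E notes that the values 8 and 11 may be judged neutral or positive). The range boundaries in this sketch are hypothetical, not the paper's actual Table 8 entries:

```python
import random

# Illustrative levels for a positive emotion such as happiness;
# (name, lower bound, upper bound), with deliberately overlapping ranges.
LEVELS = [
    ("neutral", 0, 8),
    ("positive", 7, 14),
    ("strongly positive", 13, 20),
]

def judge(value, rng=random):
    """Return the level judged for a value; overlaps are resolved randomly."""
    matches = [name for name, lo, hi in LEVELS if lo <= value <= hi]
    return rng.choice(matches) if matches else None
```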

Appendix E

Several input sentences and their responses are put together as a script, in which S is the input sentence and O is the output sentence. All the inputs are given one by one without any delay, so there is no attenuation. The initial emotions are all neutral, i.e., (happiness, anger, fear, sadness, disgust, surprise) = (0, 0, 0, 0, 0, 0), and S = “Do you want to eat worm?” The purpose is to increase the degree of Shiau_Lu’s disgust. Figure 15a summarizes the processing of this input sentence. The six emotional updating variables are \(E_{\mathrm{h}}\): \(-7\), \(E_{\mathrm{a}}\): 4, \(E_{\mathrm{f}}\): 7, \(E_{\mathrm{sad}}\): 3, \(E_{\mathrm{d}}\): 9, \(E_{\mathrm{sur}}\): \(-3\) (see Appendix B). The six emotion variables (abbreviated E.V.) are then \((-7, 4, 7, 3, 9, -3)\). After a random number is generated for each emotion variable, the tuning scores obtained are \((0, 2, 1, -1, 2, 1)\), so the tuned E.V. \(= (-7, 6, 8, 2, 11, -2)\). After a random number is generated for each of fear and disgust, fear is judged neutral and disgust positive, since according to Fig. 7 and Appendix D, the values 8 and 11 may be either neutral or positive. Finally, the primary emotion is disgust, positive-disgust is chosen, and O, as shown in Fig. 15b, is “You make me sick!”
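The update step above can be sketched as follows. Note that the paper determines the primary emotion through fuzzy inference; the `primary_emotion` helper below is a simplified stand-in that just takes the emotion variable with the largest value:

```python
import random

EMOTIONS = ["happiness", "anger", "fear", "sadness", "disgust", "surprise"]

def update_emotions(current, updates, rng=random):
    """Add each sentence's updating variable to the corresponding emotion
    variable, then apply a random tuning offset in [-2, 2] (the range chosen
    from the questionnaire in Appendix A)."""
    return [c + u + rng.randint(-2, 2) for c, u in zip(current, updates)]

def primary_emotion(values):
    # Simplified stand-in for the fuzzy inference step: the emotion
    # variable with the largest value is taken as the primary emotion.
    idx = max(range(len(values)), key=lambda i: values[i])
    return EMOTIONS[idx]
```

For the worked example, updating a neutral state with the variables of “Do you want to eat worm?” yields values within \(\pm 2\) of \((-7, 4, 7, 3, 9, -3)\), and the tuned E.V. \((-7, 6, 8, 2, 11, -2)\) gives disgust as the primary emotion.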

Fig. 15

Shiau_Lu’s primary emotion is neutral and S is “Do you want to eat worm?” The new primary emotion is disgust and the response O is “You make me sick!”

Next, as shown in Fig. 16a, S is “You are a beauty.”, which increases the degree of its happiness and reduces the degree of its disgust; the mood then returns to neutral, and O, as shown in Fig. 16b, is “Thanks.” In Fig. 17a, S is “Will you marry me?”, which may make it feel happy; O, as shown in Fig. 17b, is “I know I am cute, but I am a robot.” In Fig. 18, S is “Let’s go skydiving.”, which may increase the degree of its fear; at this moment, the mood approaches fear and happiness. In Figs. 19, 20, and 21, the sentence “You look so beautiful.” is input three times in a row, so the mood changes from fear to happiness and then to strong happiness.

Table 8 The emotion levels of happiness and fear
Fig. 16

S is “You are a beauty.” The new primary emotion returns to neutral and the response O is “Thanks”

Fig. 17

S is “Will you marry me?” The new primary emotion is happiness and the response O is “I know I am cute, but I am a robot”

Fig. 18

S is “Let’s go skydiving.” The new primary emotion is fear and the response O is “No!”

Fig. 19

S = “You look so beautiful.” The new primary emotion is happiness and the response O is “Thanks”

Fig. 20

S = “You look so beautiful.” again. The new primary emotion is strong happiness and the response O is “Haha! I know.” In fact, “You always make me feel happy!” is also a possible answer, but “Haha! I know.” is selected at random

Fig. 21

S = “You look so beautiful.” is input for the third time. The new primary emotion is strong happiness and the response O is “You are beautiful in your mind.” Due to the random choice, “Haha! I know.” is not selected this time


About this article

Cite this article

Leu, FY., Liu, Jc., Hsu, YT. et al. The simulation of an emotional robot implemented with fuzzy logic. Soft Comput 18, 1729–1743 (2014). https://doi.org/10.1007/s00500-013-1217-1
