
JACIII Vol.12 No.3 pp. 304-313
doi: 10.20965/jaciii.2008.p0304
(2008)

Paper:

Fuzzy Inference based Mentality Expression for Eye Robot in Affinity Pleasure-Arousal Space

Yoichi Yamazaki, Yutaka Hatakeyama, Fangyan Dong,
Kohei Nomoto, and Kaoru Hirota

Tokyo Institute of Technology

Received: September 19, 2007
Accepted: February 15, 2008
Published: May 20, 2008
Keywords: robot, mentality expression, human interface
Abstract
A mentality expression system based on an affinity-pleasure-arousal space is proposed for an eye robot, aimed at communication between human beings and household robots. Mentality status in the proposed space is calculated by a fuzzy inference method that takes as input the language categories output by a speech understanding module. The constructed eye robot system expresses mentality in the affinity-pleasure-arousal space through predefined eyelid and ocular movements. Mentality expression experiments with two scenarios are performed. An evaluation questionnaire yields an average score of 3.6 out of 5.0, indicating that the proposed system is suitable as a communication architecture between an interlocutor and a robot. The system provides a basic interface for an information terminal in home environments.
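To make the pipeline described in the abstract concrete, below is a minimal Python sketch of one way such a fuzzy inference step could map language categories from a speech understanding module to a point in an affinity-pleasure-arousal space. The category names, membership grades, and rule consequents are hypothetical illustrations, not the rule base of the paper, and the weighted-average (height) defuzzification is an assumed simplification.

# Hypothetical sketch: fuzzy inference from recognized language
# categories to a point in an affinity-pleasure-arousal (A-P-A) space.
# The categories, membership grades, and rule consequents below are
# invented for illustration; they are not the paper's rule base.

from dataclasses import dataclass

@dataclass
class Rule:
    category: str                            # antecedent: language category
    consequent: tuple[float, float, float]   # (affinity, pleasure, arousal) in [-1, 1]

# Toy rule base: each language category pulls the mentality status
# toward a prototypical point in the A-P-A space.
RULES = [
    Rule("greeting",   ( 0.8,  0.6,  0.3)),
    Rule("request",    ( 0.4,  0.2,  0.5)),
    Rule("complaint",  (-0.6, -0.7,  0.6)),
    Rule("small_talk", ( 0.5,  0.4, -0.2)),
]

def infer_mentality(category_grades: dict[str, float]):
    """Weighted-average (height) defuzzification over the fired rules.

    category_grades maps each language category to its membership grade
    in [0, 1], e.g. as produced by a speech understanding module.
    """
    num = [0.0, 0.0, 0.0]
    den = 0.0
    for rule in RULES:
        w = category_grades.get(rule.category, 0.0)  # rule firing strength
        for i in range(3):
            num[i] += w * rule.consequent[i]
        den += w
    if den == 0.0:
        return (0.0, 0.0, 0.0)  # neutral status when no rule fires
    return tuple(v / den for v in num)

# Example: an utterance recognized as mostly a greeting with some small talk.
print(infer_mentality({"greeting": 0.7, "small_talk": 0.3}))

In this toy version, each fired rule pulls the mentality status toward a prototypical point in the space, and the final status is the firing-strength-weighted average of those points; an eye robot would then render the resulting point through predefined eyelid and ocular movements.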
Cite this article as:
Y. Yamazaki, Y. Hatakeyama, F. Dong, K. Nomoto, and K. Hirota, “Fuzzy Inference based Mentality Expression for Eye Robot in Affinity Pleasure-Arousal Space,” J. Adv. Comput. Intell. Intell. Inform., Vol.12 No.3, pp. 304-313, 2008.
