JACIII Vol.19 No.1 pp. 134-142
doi: 10.20965/jaciii.2015.p0134
(2015)

Paper:

Bloch Sphere-Based Representation for Quantum Emotion Space

Fei Yan*1,*2, Abdullah M. Iliyasu*2,*3, Zhen-Tao Liu*4,
Ahmed S. Salama*3, Fangyan Dong*2, and Kaoru Hirota*2

*1School of Computer Science and Technology, Changchun University of Science and Technology, No. 7089, Weixing Road, Changchun 130022, China

*2Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, G3-49, 4259 Nagatsuta, Midori-ku, Yokohama 226-8502, Japan

*3College of Engineering, Salman Bin Abdulaziz University, Al-Kharj 11942, Kingdom of Saudi Arabia

*4School of Automation, China University of Geosciences, Wuhan 430074, China

Received: March 8, 2014
Accepted: November 13, 2014
Published: January 20, 2015
Keywords: quantum computation, emotion space, Bloch sphere, quantum emotion, visualization
Abstract
A Bloch Sphere-based Emotion Space (BSES) is proposed, in which the two Bloch-sphere angles φ and θ represent, respectively, the emotion (such as happiness, surprise, anger, sadness, expectation, or relaxation, in [0, 2π)) and its intensity (from neutral to maximum, in [0, π]). The BSES exploits the psychological interpretation of color by assigning a basic color to each emotion subspace, so that the space can be visualized, and, by applying quantum gates, changes in emotion can be tracked and recovered. In an experimental validation, two typical human emotions, happiness and sadness, are analyzed and visualized in the BSES according to a preset emotional transmission model. A transition matrix that tracks emotional change can be used to control robots, allowing them to adapt and respond to human emotions.
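The abstract's parameterization, φ for the emotion category and θ for its intensity, maps directly onto the standard Bloch-sphere coordinates. The following is a minimal illustrative sketch of that mapping; the specific angle assigned to each of the six emotions is an assumption for demonstration, not a value given in the paper.

```python
import math

# Assumed placement of the six basic emotions at equal steps of pi/3
# around the sphere (illustrative only; not specified in the abstract).
EMOTION_PHI = {
    "happiness":   0.0,
    "surprise":    math.pi / 3,
    "anger":       2 * math.pi / 3,
    "sadness":     math.pi,
    "expectation": 4 * math.pi / 3,
    "relaxation":  5 * math.pi / 3,
}

def bloch_point(emotion: str, intensity: float) -> tuple:
    """Return the (x, y, z) point on the unit Bloch sphere for an emotion.

    intensity in [0, 1]: 0 maps to theta = 0 (neutral, the north pole),
    1 maps to theta = pi (maximum intensity, the south pole).
    """
    phi = EMOTION_PHI[emotion]
    theta = intensity * math.pi
    # Standard spherical-to-Cartesian conversion on the unit sphere.
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

# Half-intensity happiness lands on the equator along the +x axis.
print(bloch_point("happiness", 0.5))
```

Because the representation is a point on a sphere, a change of emotion or intensity corresponds to a rotation, which is consistent with the paper's use of quantum gates to track and recover emotional change.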
Cite this article as:
F. Yan, A. Iliyasu, Z. Liu, A. Salama, F. Dong, and K. Hirota, “Bloch Sphere-Based Representation for Quantum Emotion Space,” J. Adv. Comput. Intell. Intell. Inform., Vol.19 No.1, pp. 134-142, 2015.
References
  [1] J. A. Russell, “Reading emotion from and into faces,” The Psychology of Facial Expression, Cambridge University Press, pp. 295-320, 1997.
  [2] H. Miwa, T. Umetsu, A. Takanishi, and H. Takanobu, “Robot personality based on the equations of emotion defined in the 3D mental space,” Proc. of the 2001 IEEE Int. Conf. on Robotics & Automation, pp. 2602-2607, 2001.
  [3] Y. Yamazaki, Y. Hatakeyama, F. Dong, K. Nomoto, and K. Hirota, “Fuzzy inference based mentality expression for eye robot in affinity pleasure-arousal space,” J. of Advanced Computational Intelligence and Intelligent Informatics (JACIII), Vol.12, No.3, pp. 304-313, 2008.
  [4] R. Taki, Y. Maeda, and Y. Takahashi, “Effective emotional model of pet-type robot in interactive emotion communication,” Proc. of Joint 5th Int. Conf. on Soft Computing and Intelligent Systems and 11th Int. Symp. on Advanced Intelligent Systems, pp. 199-204, 2010.
  [5] Y. Maeda and N. Tanabe, “Basic study on interactive emotional communication by pet-type robot,” Trans. of the Society of Instrument and Control Engineers, Vol.42, No.4, pp. 359-366, 2006.
  [6] K. Terada, A. Yamauchi, and A. Ito, “Artificial emotion expression for a robot by dynamic color change,” The 21st IEEE Int. Symposium on Robot and Human Interactive Communication, pp. 9-13, 2012.
  [7] A. Yamauchi, K. Terada, and A. Ito, “An emotion expression model for a robot by dynamic illuminated color change,” Human Interface, Vol.13, No.1, pp. 41-52, 2011.
  [8] M. A. Nielsen and I. L. Chuang, “Quantum Computation and Quantum Information,” Cambridge University Press, 2000.
  [9] R. Plutchik, “Plutchik’s wheel of emotions,” 1980, available online: http://www.fractal.org/Bewustzijns-Besturings-Model/Nature-ofemotions.htm [Accessed February 8, 2014]
  [10] A. Raghuvanshi and M. Perkowski, “Fuzzy quantum circuits to model emotional behaviors of humanoid robots,” 2010 IEEE Congress on Evolutionary Computation (CEC), pp. 1-8, 2010.
  [11] Eric, John, and Paraag, “Color psychology,” 2007, available online: http://library.thinkquest.org/27066/psychology/nlcolorpsych.html [Accessed May 9, 2013]
  [12] D. Johnson, “Color psychology,” 2007, available online: http://infoplease.com/spot/colors1.html [Accessed June 20, 2013]
  [13] P. Ekman, W. V. Friesen, and J. C. Hager, “Facial Action Coding System,” A Human Face, 2002.
  [14] A. M. Iliyasu, P. Q. Le, F. Dong, and K. Hirota, “A framework for representing and producing movies on quantum computers,” Int. J. of Quantum Information, Vol.9, No.6, pp. 1459-1497, 2011.
  [15] A. M. Iliyasu, P. Q. Le, F. Dong, and K. Hirota, “Watermarking and authentication of quantum images based on restricted geometric transformations,” Information Sciences, Vol.186, No.1, pp. 126-149, 2012.
  [16] P. Xia, “Quantum computing,” J. of Computer Research and Development, Vol.38, No.10, pp. 1153-1171, 2001.
  [17] F. Yan, A. M. Iliyasu, P. Q. Le, B. Sun, F. Dong, and K. Hirota, “A parallel comparison of multiple pairs of images on quantum computers,” Int. J. of Innovative Computing and Applications, Vol.5, No.4, pp. 199-212, 2013.
  [18] F. Yan, A. M. Iliyasu, C. Fatichah, M. L. Tangel, J. P. Betancourt, F. Dong, and K. Hirota, “Quantum image searching based on probability distributions,” J. of Quantum Information Science, Vol.2, No.3, pp. 55-60, 2012.
  [19] Z. Liu, M. Wu, D. Li, L. Chen, F. Dong, Y. Yamazaki, and K. Hirota, “Concept of fuzzy atmosfield for representing communication atmosphere and its application to humans-robots interaction,” J. of Advanced Computational Intelligence and Intelligent Informatics (JACIII), Vol.17, No.1, pp. 3-17, 2013.
  [20] R. Blutner and E. Hochnadel, “Two qubits for C.G. Jung’s theory of personality,” Cognitive Systems Research, Vol.11, No.3, pp. 243-259, 2010.
  [21] K. Grammer and E. Oberzaucher, “The reconstruction of facial expression in embodied systems: New approaches to an old problem,” ZIF Mitteilungen, Vol.2, pp. 14-31, 2006.
