Development of a personified face emotion recognition technique using fitness function

ORIGINAL ARTICLE

Published in Artificial Life and Robotics

Abstract

In this article, two subjects, one South East Asian (SEA) and one Japanese, are considered for face emotion recognition using a genetic algorithm (GA). The parameters relating the face emotions differ entirely between the two subjects. All six universally accepted emotions and a neutral expression are considered for each subject. Eyes and lips are the features usually used for recognizing emotions. This paper has two parts. The first part investigates a set of image processing methods suitable for recognizing face emotion. The acquired images undergo several preprocessing steps: gray-scale conversion, histogram equalization, and filtering. Edge detection must succeed even when the light intensity is uneven; to overcome this problem, the histogram-equalized image is split into two regions of interest (ROIs), the eye region and the lip region. The same preprocessing methods are applied to both regions, but with different threshold values. The Sobel edge detection method was found to segment the image very well. Three feature extraction methods are considered and their performances compared; the method that extracts eye features fastest is adopted. The second part of the paper discusses how to recognize emotions from eye features alone. Observation of the two subjects' various emotions reveals a unique eye characteristic: the eye exhibits an ellipse with different parameters in each emotion. The GA is adopted to optimize the ellipse characteristics of the eye features in each emotion using an ellipse-based fitness function. This yields successful emotion classifications, and the emotions of the two subjects are compared.
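The core idea of the second part, a GA searching for the ellipse parameters that best describe the extracted eye edge, can be sketched in a few lines. The edge points below are synthetic, and the population size, operators, and fitness form are illustrative assumptions rather than the authors' exact setup; the fitness simply rewards ellipses whose implicit equation is close to 1 at every edge point.

```python
# Sketch of a GA fitting ellipse parameters (centre cx, cy; semi-axes a, b)
# to a set of eye-edge points.  Synthetic points stand in for the paper's
# Sobel-extracted eye edge; all GA settings here are illustrative.
import math
import random

random.seed(42)

# Synthetic "eye edge" points sampled from a known ellipse
TRUE = (50.0, 30.0, 20.0, 8.0)  # cx, cy, a, b
POINTS = [(TRUE[0] + TRUE[2] * math.cos(t), TRUE[1] + TRUE[3] * math.sin(t))
          for t in (i * 2 * math.pi / 40 for i in range(40))]

def fitness(ind):
    """Higher is better: penalise deviation of each edge point from the
    implicit ellipse equation ((x-cx)/a)^2 + ((y-cy)/b)^2 = 1."""
    cx, cy, a, b = ind
    if a <= 1e-6 or b <= 1e-6:          # degenerate ellipse: reject
        return -1e9
    return -sum(abs(((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1.0)
                for x, y in POINTS)

def mutate(ind, sigma=0.5):
    """Gaussian perturbation of every gene."""
    return tuple(g + random.gauss(0.0, sigma) for g in ind)

def crossover(p1, p2):
    """Uniform crossover: each gene taken from either parent."""
    return tuple(random.choice(pair) for pair in zip(p1, p2))

def run_ga(pop_size=60, generations=200):
    pop = [(random.uniform(0, 100), random.uniform(0, 60),
            random.uniform(1, 40), random.uniform(1, 40))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 4]      # keep the fittest quarter
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

best = run_ga()
print("best ellipse (cx, cy, a, b):", [round(g, 1) for g in best])
```

In the paper's setting the edge points would come from the Sobel-segmented eye ROI, and the optimized ellipse parameters found per expression would then serve as the signature for classifying each emotion.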




Corresponding author

Correspondence to Mohd Rizon Mohamed Juhari.

Cite this article

Karthigayan, M., Juhari, M., Nagarajan, R. et al. Development of a personified face emotion recognition technique using fitness function. Artif Life Robotics 11, 197–203 (2007). https://doi.org/10.1007/s10015-007-0428-x

