Abstract
This paper presents research on real-time biometric information recognition for expressing the needs of people with communication disabilities. Many such individuals retain facial expressiveness and produce speech that, while unintelligible, can be interpreted to communicate their needs. We use face features detected in real time in the pattern recognition process. A person's facial expressions for a specific need may not be identical, but they share similarities that can be identified through pattern recognition and learned using artificial intelligence. EmoCom achieved a recognition rate of 85% in both cluttered indoor and outdoor environments.
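The abstract's core idea is that repeated expressions for the same need are similar but not identical, so they can be grouped and matched against learned prototypes. The paper's actual classifier is not described in the abstract, so the following is only a minimal illustrative sketch of that idea using a nearest-centroid match over hypothetical, pre-extracted feature vectors (the function names and numeric values are assumptions, not the authors' method):

```python
# Hypothetical sketch: mapping facial-expression feature vectors to "need"
# categories by similarity. Face detection and feature extraction are assumed
# to have already produced fixed-length vectors; values are illustrative only.

import math

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train(samples_by_need):
    """Build one prototype (centroid) per need from training samples."""
    return {need: centroid(vs) for need, vs in samples_by_need.items()}

def classify(prototypes, features):
    """Return the need whose prototype is nearest to the new features."""
    return min(prototypes, key=lambda need: euclidean(prototypes[need], features))

# Illustrative training data: expressions for one need are similar but not
# identical, so each need maps to several nearby vectors.
training = {
    "hungry":  [[0.9, 0.1, 0.2], [0.8, 0.2, 0.1]],
    "thirsty": [[0.1, 0.9, 0.3], [0.2, 0.8, 0.2]],
}
protos = train(training)
print(classify(protos, [0.85, 0.15, 0.15]))  # → hungry
```

A real system would replace the centroid matcher with a trained model (the related work cited by the authors uses neural networks) and feed it features extracted from live video rather than hand-written vectors.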
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
About this paper
Theng, L.B. (2009). Portable Real Time Needs Expression for People with Communication Disabilities. In: Tavangarian, D., Kirste, T., Timmermann, D., Lucke, U., Versick, D. (eds) Intelligent Interactive Assistance and Mobile Multimedia Computing. IMC 2009. Communications in Computer and Information Science, vol 53. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10263-9_8
DOI: https://doi.org/10.1007/978-3-642-10263-9_8
Print ISBN: 978-3-642-10262-2
Online ISBN: 978-3-642-10263-9