Abstract
A pixel-pattern-based texture feature (PPBTF) is proposed for real-time gender recognition. A gray-scale image is transformed into a pattern map in which edges and lines characterize the texture information. On the basis of the pattern map, a feature vector is formed from the numbers of pixels belonging to each pattern. The image basis functions obtained by principal component analysis (PCA) serve as the templates for pattern matching. The characteristics of the feature are analyzed through an application to gender recognition: AdaBoost is used to select the most discriminative feature subset, and a support vector machine (SVM) is adopted for classification. Experiments on frontal images from the FERET database show that, compared with Gabor features, PPBTF is an effective facial representation and is considerably faster to compute.
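To make the pipeline described above concrete, the following is a minimal sketch of PPBTF feature extraction, not the authors' implementation. It assumes that the pattern templates are the leading PCA basis functions of small image patches, that each pixel is labeled with the template its mean-removed neighborhood correlates with most strongly, and that the feature vector concatenates per-block counts of the pattern labels. Parameter names such as patch_size, n_patterns, and grid are illustrative; the paper's exact matching rule and parameters may differ.

```python
# Sketch of a PPBTF-style pipeline (illustrative assumptions, see lead-in above).
import numpy as np

def learn_pattern_templates(train_images, patch_size=5, n_patterns=8,
                            n_samples=20000, rng=None):
    """Estimate pattern templates as the leading PCA basis functions of random patches."""
    rng = np.random.default_rng(rng)
    patches = []
    for _ in range(n_samples):
        img = train_images[rng.integers(len(train_images))]
        y = rng.integers(img.shape[0] - patch_size + 1)
        x = rng.integers(img.shape[1] - patch_size + 1)
        patches.append(img[y:y + patch_size, x:x + patch_size].ravel())
    patches = np.asarray(patches, dtype=np.float64)
    patches -= patches.mean(axis=0)
    # PCA via eigen-decomposition of the patch covariance matrix
    cov = patches.T @ patches / len(patches)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_patterns]
    return eigvecs[:, order].T.reshape(n_patterns, patch_size, patch_size)

def pattern_map(image, templates):
    """Label every pixel with the index of the template its neighborhood matches best."""
    n_patterns, p, _ = templates.shape
    h, w = image.shape
    pad = p // 2
    padded = np.pad(image.astype(np.float64), pad, mode="edge")
    flat_templates = templates.reshape(n_patterns, -1)
    labels = np.zeros((h, w), dtype=np.int32)
    # Plain Python loop for clarity; a real-time version would vectorize this step.
    for y in range(h):
        for x in range(w):
            patch = padded[y:y + p, x:x + p].ravel()
            patch = patch - patch.mean()                # remove local brightness
            scores = np.abs(flat_templates @ patch)     # correlation with each template
            labels[y, x] = np.argmax(scores)
    return labels

def ppbtf_feature(labels, n_patterns, grid=(4, 4)):
    """Concatenate per-block histograms of pattern labels into one feature vector."""
    h, w = labels.shape
    gy, gx = grid
    feats = []
    for by in range(gy):
        for bx in range(gx):
            block = labels[by * h // gy:(by + 1) * h // gy,
                           bx * w // gx:(bx + 1) * w // gx]
            feats.append(np.bincount(block.ravel(), minlength=n_patterns))
    return np.concatenate(feats).astype(np.float64)
```

Given such feature vectors for a training set, AdaBoost-based feature selection and an SVM classifier (for example, scikit-learn's AdaBoostClassifier and SVC) could then be trained as described in the abstract; those stages are standard and omitted from the sketch.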
Acknowledgments
Portions of the research in this article use the FERET database of facial images collected under the FERET program.
Cite this article
Lu, H., Huang, Y., Chen, Y. et al. Automatic gender recognition based on pixel-pattern-based texture feature. J Real-Time Image Proc 3, 109–116 (2008). https://doi.org/10.1007/s11554-008-0072-2