Abstract
In this paper we address the problem of class discrimination to improve the performance of text classification, and study a discrimination-based feature selection technique in which features are selected to enlarge the separation among competing classes, a criterion we refer to as discrimination capability. The proposed approach discards features with small discrimination capability, measured by Gaussian divergence, so as to enhance the robustness and the discrimination power of the text classification system. To evaluate its performance, comparison experiments with a multinomial naïve Bayes classifier were conducted on the Newsgroups and Reuters-21578 data collections. Experimental results show that on the Newsgroups data set the divergence measure outperforms the MI measure and performs slightly better than the DF measure, and that it outperforms both measures on the Reuters-21578 data set. This demonstrates that discrimination-based feature selection contributes to enhancing the discrimination power of a text classification model.
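The abstract describes ranking features by how well they separate competing classes, measured by Gaussian divergence, and discarding low-scoring features. The following is an illustrative sketch only, not the paper's exact formulation: it assumes per-class feature values are modeled as univariate Gaussians and uses a symmetric KL divergence between the two class-conditional Gaussians as the discrimination score. The function names (`divergence_score`, `select_features`) are hypothetical.

```python
import math

def gaussian_kl(mu1, var1, mu2, var2):
    # KL divergence KL(N(mu1, var1) || N(mu2, var2)) between univariate Gaussians.
    return 0.5 * (math.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def divergence_score(values_c1, values_c2, eps=1e-9):
    # Symmetric Gaussian divergence of one feature between two classes.
    # eps avoids zero variance for constant features.
    def stats(xs):
        mu = sum(xs) / len(xs)
        var = sum((x - mu) ** 2 for x in xs) / len(xs) + eps
        return mu, var
    mu1, v1 = stats(values_c1)
    mu2, v2 = stats(values_c2)
    return gaussian_kl(mu1, v1, mu2, v2) + gaussian_kl(mu2, v2, mu1, v1)

def select_features(features_c1, features_c2, k):
    # features_c1[i] / features_c2[i]: observed values of feature i in class 1 / class 2.
    # Rank features by divergence and keep the indices of the top k.
    scores = [divergence_score(f1, f2)
              for f1, f2 in zip(features_c1, features_c2)]
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return ranked[:k]
```

A feature whose distribution is identical in both classes scores near zero and is discarded first, while a feature whose class-conditional means are well separated scores high and is retained.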
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhu, J., Wang, H., Zhang, X. (2006). Discrimination-Based Feature Selection for Multinomial Naïve Bayes Text Classification. In: Matsumoto, Y., Sproat, R.W., Wong, KF., Zhang, M. (eds) Computer Processing of Oriental Languages. Beyond the Orient: The Research Challenges Ahead. ICCPOL 2006. Lecture Notes in Computer Science(), vol 4285. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11940098_15
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-49667-0
Online ISBN: 978-3-540-49668-7