Abstract
Feature selection is an important task in machine learning. In this work, we design a robust algorithm for optimal feature subset selection and present a global optimization technique for feature weighting: margin-induced loss functions are introduced to evaluate features, and linear programming is employed to search for the optimal solution. The derived weights are combined with the nearest-neighbor rule. The proposed technique is tested on UCI data sets and, compared with Simba and LMFW, proves both effective and efficient.
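Since the abstract only outlines the approach, the following is a minimal, hypothetical Python sketch of the idea rather than the authors' implementation: it assumes a Relief/Simba-style hypothesis margin per sample and feature, a capped (hinge-like) margin loss, a simplex constraint on the weights, and SciPy's linprog as the LP solver; the helper names margin_weights and predict_weighted_1nn are illustrative only, and the paper's exact loss and constraints may differ.

```python
# Hypothetical sketch: margin-based feature weighting via linear programming,
# followed by a weighted 1-NN rule. Not the paper's exact formulation.
import numpy as np
from scipy.optimize import linprog
from scipy.spatial.distance import cdist

def margin_weights(X, y):
    """Learn non-negative feature weights (summing to one) that maximize the
    total capped per-sample hypothesis margin, solved as a linear program."""
    n, d = X.shape
    dist = cdist(X, X, metric="cityblock")   # unweighted L1 distances
    np.fill_diagonal(dist, np.inf)           # a sample is not its own neighbor

    # Coordinate-wise margin m_i = |x_i - nearmiss| - |x_i - nearhit|, so the
    # weighted margin w . m_i is linear in the weight vector w.
    margins = np.zeros((n, d))
    for i in range(n):
        same = (y == y[i])
        hit = np.argmin(np.where(same, dist[i], np.inf))    # nearest same-class
        miss = np.argmin(np.where(~same, dist[i], np.inf))  # nearest other-class
        margins[i] = np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])

    # LP over z = [w_1..w_d, t_1..t_n]: maximize sum_i t_i subject to
    # t_i <= w . m_i, t_i <= 1, 0 <= w <= 1, sum(w) = 1 (linprog minimizes,
    # so the objective is negated).
    c = np.concatenate([np.zeros(d), -np.ones(n)])
    A_ub = np.hstack([-margins, np.eye(n)])       # t_i - w . m_i <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.ones(d), np.zeros(n)])[None, :]
    bounds = [(0.0, 1.0)] * d + [(None, 1.0)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=bounds, method="highs")
    return res.x[:d]

def predict_weighted_1nn(X_train, y_train, X_test, w):
    """Weighted 1-NN: scaling features by non-negative weights is equivalent
    to using a weighted L1 distance."""
    dists = cdist(X_test * w, X_train * w, metric="cityblock")
    return y_train[np.argmin(dists, axis=1)]
```

Capping each per-sample margin at one keeps the objective bounded and prevents a few well-separated samples from dominating the learned weights; this particular capping and the simplex constraint are design choices of the sketch, not necessarily those of the paper.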
References
Gilad-Bachrach, R., Navot, A., Tishby, N.: Margin based feature selection–theory and algorithms. In: Proceedings of the 21st International Conference on Machine Learning, p. 40 (2004)
Chen, M., Ebert, D., Hagen, H., Laramee, R.S.: Data, Information, and Knowledge in Visualization. IEEE Computer Graphics and Applications, 12–19 (2009)
Liu, C., Jaeger, S., Nakagawa, M.: Offline Recognition of Chinese Characters: The State of the Art. IEEE Transactions on Pattern Analysis and Machine Intelligence 2, 198–213 (2004)
Saeys, Y., Inza, I., Larranaga, P.: A review of feature selection techniques in bioinformatics. Bioinformatics 19, 2507–2517 (2007)
Liu, H., Yu, L.: Toward integrating feature selection algorithms for classification and clustering. IEEE Transactions on Knowledge and Data Engineering 17, 494–502 (2005)
Kohavi, R., John, G.: Wrappers for feature subset selection. Artificial Intelligence, 234–273 (1997)
Pal, M.: Margin-based feature selection for hyperspectral data. International Journal of Applied Earth Observation and Geoinformation 11, 212–220 (2009)
Peng, H., Long, F., Ding, C.: Feature Selection Based on Mutual Information: Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy. IEEE Transactions on Pattern Analysis and Machine Intelligence 8, 1226–1236 (2005)
Huang, D., Chow, T.W.S.: Effective feature selection scheme using mutual information. Neurocomputing 63, 325–343 (2005)
Liu, H., Sun, J., Liu, L., Zhang, H.: Feature selection with dynamic mutual information. Pattern Recognition 42, 1330–1339 (2009)
Li, Y., Lu, B.-L.: Feature selection based on loss-margin of nearest neighbor classification. Pattern Recognition 42, 1914–1921 (2009)
Kononenko, I.: Estimating Attributes: Analysis and Extensions of RELIEF. In: Bergadano, F., De Raedt, L. (eds.) ECML 1994. LNCS, vol. 784, pp. 171–182. Springer, Heidelberg (1994)
Sun, Y.: Iterative RELIEF for Feature Weighting: Algorithms, Theories, and Applications. IEEE Transactions on Pattern Analysis and Machine Intelligence 6, 1–17 (2007)
Weinberger, K.Q., Blitzer, J., Saul, L.K.: Distance Metric Learning for Large Margin Nearest Neighbor Classification. Journal of Machine Learning Research, 207–244 (2009)
Chen, B., Liu, H., Chai, J., Bao, Z.: Large Margin Feature Weighting Method via Linear Programming. IEEE Transactions on Knowledge and Data Engineering 10, 1475–1486 (2009)
Merz, C.J., Murphy, P.: UCI repository of machine learning databases [OB/OL] (1996), http://www.ics.uci.edu/~mlearn/MLRRepository.html
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Pan, W., Ma, P., Su, X. (2012). Feature Weighting Algorithm Based on Margin and Linear Programming. In: Yao, J., et al. Rough Sets and Current Trends in Computing. RSCTC 2012. Lecture Notes in Computer Science(), vol 7413. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32115-3_46
DOI: https://doi.org/10.1007/978-3-642-32115-3_46
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-32114-6
Online ISBN: 978-3-642-32115-3