
Feature Selection Using Distance from Classification Boundary and Monte Carlo Simulation

  • Conference paper
Neural Information Processing (ICONIP 2018)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11304)

Abstract

In binary classification, improving performance on unknown samples requires excluding as many unnecessary features as possible from the sample representation. Among existing feature selection approaches, filter methods compute an index for each feature in advance, whereas wrapper methods search all combinations of features for the one that maximizes performance. In this paper, we propose a novel feature selection method that uses the distance from the classification boundary together with a Monte Carlo simulation. We constructed synthetic sample sets for binary classification and added features determined by random numbers to each sample. We then applied the conventional methods and the proposed method to these sample sets and examined whether the feature forming the boundary was selected. Our results demonstrate that feature selection was difficult with the conventional methods but possible with the proposed method.
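
The abstract describes the approach only at a high level, so the following is a minimal, hypothetical sketch of the underlying idea rather than the authors' published algorithm. It builds a synthetic binary sample set in which only feature 0 forms the class boundary and the remaining features are random numbers, fits an SVM (scikit-learn's SVC), and then scores each feature with a Monte Carlo permutation test: how much shuffling that feature shifts the samples' signed decision-function distances from the boundary. The permutation criterion and all parameter values here are assumptions made for illustration.

```python
# Hypothetical sketch of "distance from the classification boundary + Monte Carlo"
# feature scoring; NOT the authors' published algorithm.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic binary sample set: feature 0 determines the class,
# features 1..4 are random numbers unrelated to the label.
n_samples, n_features = 400, 5
X = rng.normal(size=(n_samples, n_features))
y = (X[:, 0] > 0).astype(int)

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
base = clf.decision_function(X)  # signed distance-like score from the boundary

n_trials = 100  # Monte Carlo repetitions per feature
for j in range(n_features):
    shifts = np.empty(n_trials)
    for t in range(n_trials):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])  # destroy feature j's information
        shifts[t] = np.mean(np.abs(clf.decision_function(Xp) - base))
    print(f"feature {j}: mean boundary-distance shift = {shifts.mean():.3f}")
```

Under this criterion, permuting the boundary-forming feature 0 should perturb the decision values far more than permuting any noise feature, which is the kind of separation the abstract reports the conventional filter and wrapper baselines failed to achieve on such sets.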

Author information

Correspondence to Yuichi Sakumura.

Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Koyama, Y., Ikeda, K., Sakumura, Y. (2018). Feature Selection Using Distance from Classification Boundary and Monte Carlo Simulation. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol. 11304. Springer, Cham. https://doi.org/10.1007/978-3-030-04212-7_9

  • DOI: https://doi.org/10.1007/978-3-030-04212-7_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-04211-0

  • Online ISBN: 978-3-030-04212-7

  • eBook Packages: Computer Science, Computer Science (R0)
