
K-Separability

  • Conference paper
Artificial Neural Networks – ICANN 2006 (ICANN 2006)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4131)

Included in the following conference series: ICANN: International Conference on Artificial Neural Networks

Abstract

Neural networks use their hidden layers to transform input data into linearly separable data clusters, with a linear or perceptron-type output layer making the final projection onto the line perpendicular to the discriminating hyperplane. For complex data with multimodal distributions this transformation is difficult to learn. Projection onto k ≥ 2 line segments is the simplest extension of linear separability, defining a much easier goal for the learning process. The difficulty of learning non-linear data distributions is shifted to the separation of line intervals, making the main part of the transformation much simpler. For the classification of difficult Boolean problems, such as the parity problem, a linear projection combined with k-separability is sufficient.
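As a rough illustration of the parity example mentioned in the abstract (a minimal NumPy sketch, not code from the paper): projecting n-bit strings onto the hypothetical weight direction w = (1, ..., 1) maps each string to its number of 1-bits, so the two parity classes occupy alternating positions along the projection line and the data become k-separable with k = n + 1 intervals, even though no single hyperplane separates them.

    import numpy as np

    # Illustrative sketch of k-separability on n-bit parity (assumed example setup).
    n = 4
    X = np.array([[(i >> b) & 1 for b in range(n)] for i in range(2 ** n)])
    labels = X.sum(axis=1) % 2          # class label = parity of the bit string

    w = np.ones(n, dtype=int)           # hypothetical projection weights (1, ..., 1)
    y = X @ w                           # projected value = number of 1-bits, 0..n

    # Each projected value hosts a single class, so n + 1 intervals along the
    # projection line separate the classes, although no single hyperplane does.
    for value in range(n + 1):
        print(value, sorted({int(c) for c in labels[y == value]}))

With n = 4 the script prints one class per projected value, alternating 0 and 1 along the line, which is the k-separable structure the abstract refers to.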




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Duch, W. (2006). K-Separability. In: Kollias, S.D., Stafylopatis, A., Duch, W., Oja, E. (eds) Artificial Neural Networks – ICANN 2006. ICANN 2006. Lecture Notes in Computer Science, vol 4131. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11840817_20

  • DOI: https://doi.org/10.1007/11840817_20

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-38625-4

  • Online ISBN: 978-3-540-38627-8

  • eBook Packages: Computer Science, Computer Science (R0)
