On the Computational Complexity of Learning Bithreshold Neural Units and Networks

  • Conference paper
  • First Online:
Lecture Notes in Computational Intelligence and Decision Making (ISDMCI 2019)

Abstract

We study the properties and computational capabilities of bithreshold neural-like units with real weights. We state and justify two sufficient conditions ensuring that two sets in n-dimensional vector space can be separated by a single bithreshold neuron. Our approach is based on the convex and affine hulls of the sets and is applicable when one of the two sets is compact and the other is finite. We also correct and refine some earlier results concerning bithreshold separability. We then consider the hardness of learning bithreshold neurons and examine the complexity of deciding whether a given Boolean function of n variables is realizable by a single bithreshold unit. Our main result is that verifying bithreshold separability is NP-complete; the same holds for neural networks composed of such computational units. Finally, we propose continuous modifications of the bithreshold activation function that smooth away these difficulties and make modern learning paradigms and techniques applicable to such networks.
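To illustrate the kind of unit the abstract describes, the sketch below implements a hard bithreshold activation (the unit fires exactly when the weighted sum of its inputs falls between two thresholds) together with one possible continuous surrogate built from a product of sigmoids. The {0, 1} output coding, the particular surrogate, and the steepness parameter `k` are illustrative assumptions, not the paper's exact construction.

```python
import math

def bithreshold(x, w, t1, t2):
    """Hard bithreshold unit: outputs 1 iff the weighted sum lies in [t1, t2].
    The {0, 1} output coding is an assumption; the paper's coding may differ."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if t1 <= s <= t2 else 0

def smooth_bithreshold(x, w, t1, t2, k=10.0):
    """Differentiable surrogate: a product of two sigmoids approximating the
    indicator of [t1, t2]. The steepness k is an illustrative choice; as k
    grows, the surrogate approaches the hard unit."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    return sig(k * (s - t1)) * sig(k * (t2 - s))
```

A surrogate of this kind is what makes gradient-based training conceivable: the hard unit has zero gradient almost everywhere, while the smooth version is differentiable in both the weights and the two thresholds.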



Author information


Corresponding author

Correspondence to Vladyslav Kotsovsky .


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Kotsovsky, V., Geche, F., Batyuk, A. (2020). On the Computational Complexity of Learning Bithreshold Neural Units and Networks. In: Lytvynenko, V., Babichev, S., Wójcik, W., Vynokurova, O., Vyshemyrskaya, S., Radetskaya, S. (eds) Lecture Notes in Computational Intelligence and Decision Making. ISDMCI 2019. Advances in Intelligent Systems and Computing, vol 1020. Springer, Cham. https://doi.org/10.1007/978-3-030-26474-1_14
