Abstract
We study questions concerning the properties and capabilities of computational bithreshold real-weighted neural-like units. We state and justify two sufficient conditions ensuring that two sets in n-dimensional vector space can be separated by a single bithreshold neuron. Our approach is based on the convex and affine hulls of sets and applies when one of the two sets is compact and the other is finite. We also correct and refine some previous results concerning bithreshold separability. We then consider the hardness of learning bithreshold neurons and examine the complexity of checking whether a given Boolean function of n variables is realizable by a single bithreshold unit. Our main result is that the problem of verifying bithreshold separability is NP-complete. The same holds for neural networks composed of such computational units. We propose continuous modifications of the bithreshold activation function that smooth away these difficulties and make it possible to apply modern paradigms and learning techniques to such networks.
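To make the object of study concrete, the following is a minimal sketch of a bithreshold unit and one plausible continuous surrogate. It assumes the common convention that the unit fires exactly when the weighted sum falls between the two thresholds; the function names, the {0, 1} output coding, and the sigmoid-product smoothing are illustrative assumptions, not the paper's exact construction.

```python
import math

def bithreshold(w, x, t1, t2):
    """Hard bithreshold activation (illustrative convention):
    fires (returns 1) iff the weighted sum lies in [t1, t2]."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if t1 <= s <= t2 else 0

def smooth_bithreshold(w, x, t1, t2, k=10.0):
    """One possible continuous modification: a product of two sigmoids.
    As the slope k grows, the output approaches the hard gate above,
    while remaining differentiable and hence usable with gradient-based
    learning techniques."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    return sig(k * (s - t1)) * sig(k * (t2 - s))
```

For example, with weights (1, 1) and thresholds t1 = 0, t2 = 2, the input (0.5, 0.5) falls inside the band and the hard unit fires, while (3, 3) falls outside and it does not; the smooth variant returns values strictly between 0 and 1 in both cases.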
© 2020 Springer Nature Switzerland AG
Cite this paper
Kotsovsky, V., Geche, F., Batyuk, A. (2020). On the Computational Complexity of Learning Bithreshold Neural Units and Networks. In: Lytvynenko, V., Babichev, S., Wójcik, W., Vynokurova, O., Vyshemyrskaya, S., Radetskaya, S. (eds) Lecture Notes in Computational Intelligence and Decision Making. ISDMCI 2019. Advances in Intelligent Systems and Computing, vol 1020. Springer, Cham. https://doi.org/10.1007/978-3-030-26474-1_14
DOI: https://doi.org/10.1007/978-3-030-26474-1_14
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-26473-4
Online ISBN: 978-3-030-26474-1
eBook Packages: Intelligent Technologies and Robotics (R0)