Abstract
Neural networks based on metric recognition methods make it possible to determine the structure of the network (the number of neurons, layers, and connections) from the initial conditions of a computer vision task, such as the number of images and samples, and to calculate the values of the connection weights analytically. As feedforward neural networks, they can also be trained by classical learning algorithms. The possibility of precomputing the weight values means that the procedure for creating and training such a feedforward neural network is faster than the classical scheme, in which the weight values are generated randomly. In this work, we conduct two experiments on the MNIST handwritten digit dataset that confirm this statement.
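The contrast between the two initialization schemes can be illustrated with a minimal sketch. The details of the analytic weight calculation are not given in the abstract, so the sketch below assumes one common metric scheme: each output neuron is set to a nearest-class-mean discriminant, with weight vector equal to the class mean and bias equal to minus half its squared norm. The toy data, the `accuracy` helper, and all variable names are illustrative assumptions, not the author's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 3 well-separated classes in 2-D, 5 reference samples each.
n_classes, dim, n_per_class = 3, 2, 5
samples = {c: rng.normal(loc=3.0 * c, scale=0.5, size=(n_per_class, dim))
           for c in range(n_classes)}

# Analytic initialization (assumed nearest-class-mean scheme):
# score_c(x) = w_c . x + b_c with w_c = mean_c, b_c = -||mean_c||^2 / 2,
# so argmax over scores equals argmin over squared distances to the means.
means = np.stack([samples[c].mean(axis=0) for c in range(n_classes)])
W_calc = means                               # shape (n_classes, dim)
b_calc = -0.5 * np.sum(means ** 2, axis=1)   # shape (n_classes,)

# Classical scheme: random weight generation, no prior structure.
W_rand = rng.normal(size=(n_classes, dim))
b_rand = np.zeros(n_classes)

def accuracy(W, b):
    """Fraction of reference samples assigned to their own class."""
    X = np.vstack([samples[c] for c in range(n_classes)])
    y = np.repeat(np.arange(n_classes), n_per_class)
    scores = X @ W.T + b
    return float((scores.argmax(axis=1) == y).mean())

print("calculated weights:", accuracy(W_calc, b_calc))
print("random weights:    ", accuracy(W_rand, b_rand))
```

On separable data the calculated weights classify correctly before any gradient step, whereas the randomly initialized network must learn the same decision boundaries from scratch, which is the source of the training speed-up claimed above.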
Change history
10 February 2021
An Erratum to this paper has been published: https://doi.org/10.1134/S0005117920120103
Cite this article
Geidarov, P. Comparative Analysis of the Results of Training a Neural Network with Calculated Weights and with Random Generation of the Weights. Autom Remote Control 81, 1211–1229 (2020). https://doi.org/10.1134/S0005117920070048