Abstract
The successful development of an application using the multilayer perceptron (MLP) model depends greatly on the structural complexity of the domains involved. Different mathematical and/or statistical techniques can be used to extract the maximum amount of this type of information from an available sample of the input space. In the context of the MLP model, such information has been used to decide on the form that the parameters of the network and/or the related learning algorithm (LA) should take. This paper describes the information subsumed in the Delaunay triangulation (DT) and Voronoi diagram (VD) of the points comprising the input space of an application, how it can be used to evaluate whether a network based on the MLP model is a suitable implementation, and how it can be used to estimate an initial architecture that can subsequently be improved by a pruning process.
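To make the idea concrete, the sketch below (in Python, assuming labelled input samples and SciPy's Delaunay routine) counts the Delaunay edges that join points of different classes and uses that count as a crude initial estimate of hidden-layer size. This heuristic is an illustrative assumption in the spirit of geometry-based design, not the exact procedure developed in the paper.

```python
# Hypothetical sketch: estimate an initial hidden-layer size from the
# Delaunay triangulation of a labelled sample of the input space.
# The heuristic (count Delaunay edges joining points of different classes)
# is an illustrative assumption, not the paper's exact procedure.
import numpy as np
from scipy.spatial import Delaunay


def estimate_hidden_units(points, labels):
    """Rough estimate of hidden units: one per class-separating Delaunay edge.

    points : (n, d) array of input samples
    labels : (n,)   array of class labels
    """
    tri = Delaunay(points)
    boundary_edges = set()
    # Collect every simplex edge whose endpoints carry different class labels.
    for simplex in tri.simplices:
        for i in range(len(simplex)):
            for j in range(i + 1, len(simplex)):
                a, b = simplex[i], simplex[j]
                if labels[a] != labels[b]:
                    boundary_edges.add((min(a, b), max(a, b)))
    return len(boundary_edges)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((200, 2))                    # 2-D input samples
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # toy, linearly separable labels
    print("suggested initial hidden units:", estimate_hidden_units(X, y))
```

In practice the raw edge count would only serve as a starting point for an architecture that a pruning process then reduces, which is the role the paper assigns to the DT/VD analysis.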