Abstract
The problem of weight initialization in multilayer perceptron networks is considered. A computationally simple weight initialization method based on the use of reference patterns is investigated in a channel equalization application. On one hand, the proposed method aims to set the initial weight values such that the inputs to network nodes lie within the active region. On the other hand, the goal is to distribute the discriminant functions formed by the hidden units evenly over the region of the input space where the training data is located. The proposed weight initialization is tested in the channel equalization application, where several alternatives for obtaining suitable reference patterns are investigated. A comparison with conventional random initialization shows that significant improvement in convergence can be achieved with the proposed method. In addition, the computational cost of the initialization was found to be negligible compared with the cost of training.
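The two goals stated above can be sketched in code: pick one reference pattern per hidden unit, aim each unit's weight vector at its pattern, scale the weights so pre-activations stay in the sigmoid's active region, and set the bias so each unit's hyperplane passes through its pattern. This is a minimal illustrative sketch, not the paper's exact algorithm; in particular, random sampling of training vectors stands in for the several reference-pattern selection schemes the paper compares, and all names and the `active_limit` parameter are assumptions.

```python
import numpy as np

def reference_pattern_init(X, n_hidden, active_limit=1.0, seed=0):
    """Hidden-layer weight initialization from reference patterns (sketch).

    X            : (n_samples, n_inputs) training data
    n_hidden     : number of hidden units
    active_limit : rough half-width of the sigmoid's active region
                   (an assumed tuning parameter, not from the paper)
    Returns (W, b) with W of shape (n_hidden, n_inputs).
    """
    rng = np.random.default_rng(seed)
    # One reference pattern per hidden unit; random sampling stands in
    # for the selection schemes studied in the paper.
    refs = X[rng.choice(len(X), size=n_hidden, replace=False)]
    # Aim each weight vector at its reference pattern.
    norms = np.linalg.norm(refs, axis=1, keepdims=True) + 1e-12
    W = refs / norms
    # Scale so |w . x| stays within the active region for the training data.
    typical = np.abs(X @ W.T).max()
    W *= active_limit / (typical + 1e-12)
    # Bias places each unit's hyperplane through its reference pattern,
    # centering that pattern's pre-activation at zero.
    b = -np.sum(W * refs, axis=1)
    return W, b
```

With this construction every training pre-activation `x . w + b` is bounded by twice `active_limit`, so no hidden unit starts out saturated, and the hyperplanes are spread across the region occupied by the sampled patterns rather than clustered near the origin as with small random weights.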
Copyright information
© 1998 Springer-Verlag Berlin Heidelberg
Cite this paper
Lehtokangas, M. (1998). Reference pattern weight initialization for equalization. In: Pasqual del Pobil, A., Mira, J., Ali, M. (eds) Tasks and Methods in Applied Artificial Intelligence. IEA/AIE 1998. Lecture Notes in Computer Science, vol 1416. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-64574-8_443
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-64574-0
Online ISBN: 978-3-540-69350-5