Abstract
We present a variant of B. Fritzke's Growing Neural Gas (GNG), the Double Growing Neural Gas (DGNG). In each insertion step of the original GNG, exactly one new cell is inserted in the middle of the edge connecting the Maximum Resource Vertex (MRV) and the maximum-resource vertex in its direct topological neighbourhood. In our DGNG, two new cells are inserted at the same time. The goal is to speed up the convergence of the learning process. Although inserting multiple cells per step can reduce the required number of learning epochs, it can also lead to a larger network structure. Simulation results on several neural network benchmarks indicate that, for many data sets, overall network performance is best when the number of new cells per insertion step is at most three; increasing it beyond three yields very little benefit and sometimes degrades performance. In this paper we consider only the DGNG. On two disease diagnosis benchmarks (Wisconsin breast cancer and soybean disease diagnosis) we tested the DGNG and found that it outperforms the original GNG, measured by the required number of epochs and by CPU time.
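The insertion step described above can be illustrated with a small sketch. This is a hypothetical reconstruction, not the authors' implementation: the function name `dgng_insert`, the data layout, and in particular the placement of the second cell (at the midpoint of the edge to the neighbour with the second-largest resource) are assumptions; the paper only states that two cells are inserted simultaneously. With `n_new=1` the sketch reduces to the standard GNG insertion.

```python
import numpy as np

def dgng_insert(weights, errors, edges, alpha=0.5, n_new=2):
    """Hypothetical sketch of one DGNG insertion step.

    weights : dict {node_id: np.ndarray}  -- reference vectors
    errors  : dict {node_id: float}       -- accumulated error ("resource")
    edges   : set of frozenset pairs      -- topology from competitive Hebbian learning
    n_new   : cells inserted per step (1 = original GNG, 2 = DGNG)
    """
    q = max(errors, key=errors.get)  # Maximum Resource Vertex (MRV)
    # Neighbours of q, sorted by descending accumulated error.
    nbrs = sorted((n for e in edges if q in e for n in e if n != q),
                  key=lambda n: -errors[n])
    new_id = max(weights) + 1
    for f in nbrs[:n_new]:
        r = new_id
        new_id += 1
        # Insert the new cell in the middle of the edge (q, f).
        weights[r] = 0.5 * (weights[q] + weights[f])
        edges.discard(frozenset((q, f)))
        edges.add(frozenset((q, r)))
        edges.add(frozenset((r, f)))
        # Redistribute the resource of q and f to the new cell.
        errors[q] *= alpha
        errors[f] *= alpha
        errors[r] = errors[q]
    return weights, errors, edges
```

Note that the neighbour list is computed once before the loop, so both cells are placed with respect to the topology as it stood before the step, matching the "inserted at the same time" description.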
References
B. Fritzke. A growing neural gas network learns topologies. In G. Tesauro, D. S. Touretzky, and T. K. Leen, editors, Advances in Neural Information Processing Systems 7, pages 625–632. MIT Press, Cambridge MA, 1995.
B. Fritzke. Automatic construction of radial basis function networks with the growing neural gas model and its relevance for fuzzy logic. In Applied Computing 1996: Proceedings of the 1996 ACM Symposium on Applied Computing, pages 624–627, Philadelphia, 1996. ACM.
F. Hamker and D. Heinke. Implementation and comparison of growing neural gas, growing cell structures and Fuzzy ARTMAP. Technical report, Schriftenreihe des FG Neuroinformatik der TU Ilmenau, Report 1/97, ISSN 0945-7518, 1997.
Teuvo Kohonen. Self-Organizing Maps. Springer, Berlin, Heidelberg, 1995. (Second Extended Edition 1997).
Thomas Martinetz. Competitive Hebbian learning rule forms perfectly topology preserving maps. In Stan Gielen and Bert Kappen, editors, Proc. ICANN'93, Int. Conf. on Artificial Neural Networks, pages 427–434, London, UK, 1993. Springer.
Lutz Prechelt. PROBEN1 — A set of benchmarks and benchmarking rules for neural network training algorithms. Technical report, Fakultät für Informatik, Universität Karlsruhe, Germany, 1994.
R. Berlich, M. Kunze, and J. Steffens. A comparison between the performance of feed forward neural networks and the supervised growing neural gas algorithm. In 5th Artificial Intelligence in High Energy Physics Workshop, Lausanne, 1996.
© 2000 Springer-Verlag London
Cite this paper
Cheng, G., Zell, A. (2000). Double Growing Neural Gas for Disease Diagnosis. In: Malmgren, H., Borga, M., Niklasson, L. (eds) Artificial Neural Networks in Medicine and Biology. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-0513-8_47
Publisher Name: Springer, London
Print ISBN: 978-1-85233-289-1
Online ISBN: 978-1-4471-0513-8