Abstract
The sphere-packing problem is the task of arranging identical spheres in a given space so as to achieve maximum density. This problem arises in the placement of kernel functions for uniform quantisation of the input space in machine learning algorithms. One example is the Cerebellar Model Articulation Controller (CMAC), where the problem appears as the placement of overlapping grids. In such situations it is desirable to achieve a uniform placement of grid vertices in the input space; this is akin to the sphere-packing problem, with the grid vertices as the centres of the spheres. The space quantisation inherent in such algorithms imposes constraints on the solution and usually requires a regular tessellation of spheres. The sphere-packing problem is difficult to solve analytically, especially under these constraints. The current approach for CMAC-based methods is to rely on published tables of grid spacings, but this has two shortcomings. First, no analytical solution has been published for the calculation of such tables: they were arrived at by exhaustive search. Second, the tables cover input spaces of only ten dimensions or fewer. Many data mining problems now rely on machine learning techniques to solve problems in higher-dimensional spaces. A new approach to obtaining suitable grid spacings, based on a Genetic Algorithm, is described; it is potentially faster than exhaustive search. The resulting grid spacings are very similar to those in the published tables, and empirical trials show that where they differ, performance on an automated classification problem is unchanged. The new approach is also feasible for more than ten dimensions, and tables of grid spacings are presented for higher-dimensional spaces. The results are applicable to any application where a regular division of the input space is required, and they allow the investigation of space-quantising algorithms for problems in high-dimensional spaces.
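The abstract does not give the details of the Genetic Algorithm, so the following is only a minimal, illustrative sketch of the idea. It assumes a common CMAC convention in which, for n overlapping layers, layer k is displaced by (k·v mod n)/n along each dimension for an integer displacement vector v, and it uses the minimum toroidal distance between layer offsets as a stand-in fitness for uniformity of vertex placement. The names (`fitness`, `evolve`) and all parameter values are invented for this example and are not from the paper.

```python
import random

def min_pairwise_dist(points):
    """Minimum toroidal (wrap-around) distance between any two offset points."""
    best = float("inf")
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d2 = sum(min(abs(a - b), 1 - abs(a - b)) ** 2
                     for a, b in zip(points[i], points[j]))
            best = min(best, d2)
    return best ** 0.5

def offsets(vector, layers):
    """Offsets of each overlapping grid layer: layer k shifted by (k*v mod n)/n."""
    return [[(k * v % layers) / layers for v in vector] for k in range(layers)]

def fitness(vector, layers):
    """Higher is better: well-separated layer offsets mean uniform vertex placement."""
    return min_pairwise_dist(offsets(vector, layers))

def evolve(dims, layers, pop_size=30, gens=60, seed=0):
    """Toy GA over integer displacement vectors (components in 1..layers-1)."""
    rng = random.Random(seed)
    pop = [[rng.randrange(1, layers) for _ in range(dims)]
           for _ in range(pop_size)]
    for _ in range(gens):
        # Rank by fitness; keep the better half as survivors.
        pop.sort(key=lambda v: fitness(v, layers), reverse=True)
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, dims) if dims > 1 else 1
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < 0.2:             # occasional point mutation
                child[rng.randrange(dims)] = rng.randrange(1, layers)
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda v: fitness(v, layers))
```

For example, `evolve(2, 5)` searches for a 2-dimensional displacement vector for a 5-layer CMAC; because each candidate is scored in milliseconds, such a search scales to dimensionalities where exhaustive enumeration of vectors becomes impractical, which is the motivation the abstract gives for the GA approach.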
© 2002 Springer-Verlag Berlin Heidelberg
Cornforth, D. (2002). Evolution in the Orange Box — A New Approach to the Sphere-Packing Problem in CMAC-Based Neural Networks. In: McKay, B., Slaney, J. (eds) AI 2002: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol 2557. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36187-1_29
DOI: https://doi.org/10.1007/3-540-36187-1_29
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-00197-3
Online ISBN: 978-3-540-36187-9