Abstract
We discuss an algorithm that finds an explicit algebraic expression for a dependent variable as a function of an arbitrary number of independent variables, where the data may be arbitrary, e.g. arising from experiment. That such an approximation is possible follows from the Universal Approximation Theorem (UAT). Unlike the neural network (NN) approach with which it is frequently associated, the relationship between the dependent and independent variables is explicit, thus resolving the "black box" character of NNs. The UAT relies on a nonlinear function (called the activation function) such as the logistic 1/(1+e^(-x)); thus, any function is expressible as a combination of a set of logistics. We show that a close polynomial approximation of the logistic is possible using only a constant and monomials of odd degree. Hence, an upper bound D on the degree of the approximating polynomial may be found, and from D we may derive the form of the resulting model. We discuss how to determine the best such set of monomials by using a genetic algorithm, leading to the best L∞-L2 approximation: given a selected fixed number of coefficients, the algorithm finds the best approximation polynomial, i.e. the best combination of coefficients and their values. We present some experimental results.
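The key step described above can be illustrated with a minimal sketch (not the authors' implementation): a least-squares (L2) fit of the logistic 1/(1+e^(-x)) using only a constant term and monomials of odd degree. The interval [-5, 5] and the maximum degree 9 are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Sample the logistic activation on an illustrative interval.
x = np.linspace(-5.0, 5.0, 1001)
y = 1.0 / (1.0 + np.exp(-x))

# Basis: a constant plus monomials of odd degree, as the abstract describes.
odd_degrees = [1, 3, 5, 7, 9]
A = np.column_stack([np.ones_like(x)] + [x**d for d in odd_degrees])

# L2-optimal coefficients via least squares.
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Measure the worst-case (L-infinity) error of the L2 fit.
approx = A @ coeffs
max_err = np.max(np.abs(approx - y))
print(f"constant term = {coeffs[0]:.4f}, max |error| = {max_err:.4f}")
```

Because the logistic equals 0.5 plus an odd function, the fitted constant comes out at 0.5 and the odd monomials absorb the rest; this is why the constant-plus-odd-degree basis suffices.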
© 2014 Springer International Publishing Switzerland
Cite this paper
Kuri-Morales, A., Cartas-Ayala, A. (2014). Polynomial Multivariate Approximation with Genetic Algorithms. In: Sokolova, M., van Beek, P. (eds) Advances in Artificial Intelligence. Canadian AI 2014. Lecture Notes in Computer Science, vol 8436. Springer, Cham. https://doi.org/10.1007/978-3-319-06483-3_30
Print ISBN: 978-3-319-06482-6
Online ISBN: 978-3-319-06483-3