Abstract
This paper proposes a new method for finding polynomials to fit multivariate data containing numeric and nominal variables. Each polynomial is accompanied by a corresponding nominal condition stating when to apply the polynomial. Such a nominally conditioned polynomial is called a rule. A set of such rules can be regarded as a single numeric function, and such a function can be closely approximated by a single three-layer neural network. After training several networks with different numbers of hidden units, the method selects the best trained network and restores the final rules from it. Experiments using three data sets show that the proposed method works well in finding very succinct and interesting rules, even from data containing irrelevant variables and a small amount of noise.
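To make the notion of a rule concrete, the following is a minimal sketch of a set of nominally conditioned polynomials evaluated as a single numeric function. All variable names, nominal conditions, and coefficients here are hypothetical illustrations, not rules discovered in the paper's experiments.

```python
def rule_set(material, x1, x2):
    """Evaluate a rule set as one numeric function.

    Each rule pairs a nominal condition (here, on `material`)
    with a polynomial in the numeric variables x1, x2.
    """
    if material == "steel":            # nominal condition of rule 1
        return 2.0 * x1**2 + 0.5 * x2  # its polynomial
    elif material == "copper":         # nominal condition of rule 2
        return 1.5 * x1 * x2 + 3.0
    else:                              # default rule
        return x1 + x2

print(rule_set("steel", 1.0, 2.0))   # 2*1 + 0.5*2 = 3.0
```

Because the whole rule set is a single piecewise-polynomial function, it can in principle be approximated by one three-layer network, which is the representation the method trains and then decompiles back into rules.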
© 2001 Springer-Verlag Berlin Heidelberg
Nakano, R., Saito, K. (2001). Finding Polynomials to Fit Multivariate Data Having Numeric and Nominal Variables. In: Hoffmann, F., Hand, D.J., Adams, N., Fisher, D., Guimaraes, G. (eds) Advances in Intelligent Data Analysis. IDA 2001. Lecture Notes in Computer Science, vol 2189. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44816-0_26
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42581-6
Online ISBN: 978-3-540-44816-7