Abstract
This paper shows that a connectionist law discovery method called RF6 can discover a law in the form of a set of nominally conditioned polynomials from data containing both nominal and numeric values. RF6 learns a composite of nominally conditioned polynomials using a single neural network, selects the best among candidate networks, and decomposes the selected network into a set of rules, where a rule means a nominally conditioned polynomial. Experiments showed that the proposed method works well in discovering such a law even from data containing irrelevant variables and a small amount of noise.
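To illustrate the kind of numeric law involved (this is a hypothetical sketch, not the authors' RF6 algorithm): a single power-law term y = c · x1^w1 · x2^w2 becomes linear after taking logarithms, log y = log c + w1·log x1 + w2·log x2, so its coefficient and exponents can be recovered by ordinary least squares on log-transformed data. The constant 2.0 and the exponents 1.5 and -0.5 below are arbitrary illustrative values.

```python
import numpy as np

# Hypothetical illustration, not the RF6 method itself: recover the
# parameters of a power law y = c * x1^w1 * x2^w2 by log-linear
# least squares on noise-free synthetic data.
rng = np.random.default_rng(0)
x = rng.uniform(0.5, 2.0, size=(200, 2))      # two numeric variables
y = 2.0 * x[:, 0] ** 1.5 * x[:, 1] ** -0.5    # ground-truth power law

# Design matrix [1, log x1, log x2] for the log-linear regression.
A = np.column_stack([np.ones(len(x)), np.log(x)])
coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)
c, w1, w2 = np.exp(coef[0]), coef[1], coef[2]
print(round(c, 3), round(w1, 3), round(w2, 3))  # recovers 2.0, 1.5, -0.5
```

A full law of the form discovered by RF6 would combine several such terms and attach nominal conditions to them; the point here is only that the numeric part is learnable from data.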
Copyright information
© 1999 Springer-Verlag Berlin Heidelberg
Cite this paper
Nakano, R., Saito, K. (1999). Discovery of a Set of Nominally Conditioned Polynomials. In: Arikawa, S., Furukawa, K. (eds) Discovery Science. DS 1999. Lecture Notes in Computer Science(), vol 1721. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46846-3_26
Print ISBN: 978-3-540-66713-1
Online ISBN: 978-3-540-46846-2