
Discovering Polynomials to Fit Multivariate Data Having Numeric and Nominal Variables

Chapter in: Progress in Discovery Science

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2281)

Abstract

This paper proposes an improved version of a method for discovering polynomials that fit multivariate data containing both numeric and nominal variables. Each polynomial is accompanied by a nominal condition stating when the polynomial applies; such a nominally conditioned polynomial is called a rule. A set of such rules can be regarded as a single numeric function, and this function can be approximated and learned by a three-layer neural network. The method trains networks with different numbers of hidden units, selects the best of them using a newly introduced double layer of cross-validation, and restores the final rules from the selected network. Experiments on two data sets show that the proposed method discovers very succinct and interesting rules even from data containing irrelevant variables and a small amount of noise.
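To make the two ideas in the abstract concrete, the following is a minimal, purely illustrative sketch and not the authors' implementation: it shows how a set of nominally conditioned polynomials can be evaluated as a single numeric function, and how the number of hidden units of a three-layer network could be chosen by an outer/inner (double) layer of cross-validation. All names (Rule, rule_set_function, select_hidden_units) are hypothetical, and scikit-learn's MLPRegressor with nested cross-validation merely stands in for the paper's network and its selection procedure.

# Illustrative sketch only; assumptions noted in the text above.
from dataclasses import dataclass
from typing import Dict, Sequence

import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPRegressor


@dataclass
class Rule:
    """A nominally conditioned polynomial: if the nominal part of a sample
    matches `condition`, predict y with the attached polynomial."""
    condition: Dict[str, str]             # e.g. {"material": "steel"}
    coefficients: Sequence[float]         # c_0, c_1, ..., c_k
    exponents: Sequence[Sequence[float]]  # exponent vector of each term

    def matches(self, nominal: Dict[str, str]) -> bool:
        return all(nominal.get(k) == v for k, v in self.condition.items())

    def evaluate(self, x: np.ndarray) -> float:
        # y = c_0 + sum_j c_j * prod_i x_i ** e_{ji}
        y = self.coefficients[0]
        for c, e in zip(self.coefficients[1:], self.exponents):
            y += c * np.prod(np.power(x, e))
        return float(y)


def rule_set_function(rules: Sequence[Rule], nominal: Dict[str, str],
                      x: np.ndarray) -> float:
    """A set of rules acts as one numeric function: the first rule whose
    nominal condition holds supplies the polynomial that is applied."""
    for rule in rules:
        if rule.matches(nominal):
            return rule.evaluate(x)
    raise ValueError("no rule covers this nominal combination")


def select_hidden_units(X: np.ndarray, y: np.ndarray,
                        candidates=(2, 3, 4, 5, 6)) -> int:
    """Choose the number of hidden units with two layers of cross-validation:
    an inner loop scores each candidate, an outer loop checks the choice."""
    outer = KFold(n_splits=5, shuffle=True, random_state=0)
    votes = []
    for train_idx, _ in outer.split(X):
        X_tr, y_tr = X[train_idx], y[train_idx]
        inner_scores = {
            h: cross_val_score(MLPRegressor(hidden_layer_sizes=(h,),
                                            max_iter=2000, random_state=0),
                               X_tr, y_tr, cv=3,
                               scoring="neg_mean_squared_error").mean()
            for h in candidates
        }
        votes.append(max(inner_scores, key=inner_scores.get))
    # Return the most frequently chosen hidden-unit count.
    return max(set(votes), key=votes.count)

In this toy form, each nominal combination selects one polynomial over the numeric variables, so the whole rule set behaves as one piecewise-defined numeric function, which is what the paper approximates with a single three-layer network before restoring the rules.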

Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Nakano, R., Saito, K. (2002). Discovering Polynomials to Fit Multivariate Data Having Numeric and Nominal Variables. In: Arikawa, S., Shinohara, A. (eds) Progress in Discovery Science. Lecture Notes in Computer Science, vol 2281. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45884-0_36

  • DOI: https://doi.org/10.1007/3-540-45884-0_36

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43338-5

  • Online ISBN: 978-3-540-45884-5

  • eBook Packages: Springer Book Archive
