
Information Geometry of Predictor Functions in a Regression Model

  • Conference paper
  • Appears in: Geometric Science of Information (GSI 2017)

Part of the book series: Lecture Notes in Computer Science (volume 10589)


Abstract

We discuss an information-geometric framework for a regression model, in which the regression function is accompanied by the predictor function and the conditional density function. We introduce the e-geodesic and the m-geodesic on the space of all predictor functions; this pair of geodesics leads to a Pythagorean identity for a right triangle spanned by them. Further, we discuss statistical modeling that combines predictor functions in a nonlinear fashion by means of a generalized average, and in particular we observe the flexible behavior of the log-exp average.
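The log-exp average mentioned in the abstract can be sketched numerically. The following is a minimal illustration, not the authors' code: the function name `log_exp_average`, the uniform weights, and the toy predictors are assumptions, following the general quasi-linear form f_tau(x) = (1/tau) * log(sum_j w_j * exp(tau * f_j(x))), which interpolates between the weighted arithmetic mean (tau -> 0) and the max/min of the predictors (tau -> +/- infinity).

```python
import numpy as np

def log_exp_average(predictors, x, tau, weights=None):
    """Combine predictor functions f_j via the log-exp average:

        f_tau(x) = (1/tau) * log(sum_j w_j * exp(tau * f_j(x)))

    As tau -> 0 this approaches the weighted arithmetic mean; large
    positive tau pushes the combination toward the maximum predictor,
    large negative tau toward the minimum.
    """
    vals = np.array([f(x) for f in predictors], dtype=float)
    if weights is None:
        w = np.full(len(predictors), 1.0 / len(predictors))
    else:
        w = np.asarray(weights, dtype=float)
    if abs(tau) < 1e-12:
        # tau -> 0 limit: weighted arithmetic mean
        return float(np.dot(w, vals))
    # log-sum-exp with a max-shift for numerical stability
    m = np.max(tau * vals)
    return float((m + np.log(np.dot(w, np.exp(tau * vals - m)))) / tau)

# Two toy predictor functions (hypothetical, for illustration only)
f1 = lambda x: x
f2 = lambda x: x ** 2

print(log_exp_average([f1, f2], 2.0, 1e-13))  # near the arithmetic mean (2 + 4) / 2
print(log_exp_average([f1, f2], 2.0, 50.0))   # near max(2, 4)
```

Varying tau thus gives a one-parameter family of nonlinear combinations of the same predictors, which is the flexibility the abstract attributes to the log-exp average.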



Author information

Corresponding author: Shinto Eguchi


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Eguchi, S., Omae, K. (2017). Information Geometry of Predictor Functions in a Regression Model. In: Nielsen, F., Barbaresco, F. (eds.) Geometric Science of Information. GSI 2017. Lecture Notes in Computer Science, vol. 10589. Springer, Cham. https://doi.org/10.1007/978-3-319-68445-1_65

  • DOI: https://doi.org/10.1007/978-3-319-68445-1_65

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-68444-4

  • Online ISBN: 978-3-319-68445-1

  • eBook Packages: Computer Science, Computer Science (R0)
