Information Geometry and Its Applications: Convex Function and Dually Flat Manifold

Chapter in: Emerging Trends in Visual Computing (ETVC 2008)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 5416)

Abstract

Information geometry emerged from studies on the invariant properties of a manifold of probability distributions. It includes convex analysis and its duality as a special but important part. Here, we begin with a convex function and construct a dually flat manifold. The manifold possesses a Riemannian metric, two types of geodesics, and a divergence function. The generalized Pythagorean theorem and the dual projection theorem are derived from this structure. Extending this convex analysis, we construct the alpha-geometry. The review then presents the geometry of a manifold of probability distributions and touches upon a wide range of applications. The Appendix gives an easily understandable introduction to differential geometry and its duality.
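
To make the construction outlined in the abstract concrete, here is a minimal numerical sketch; it is not taken from the chapter, and the names psi, grad_psi, metric, and divergence are illustrative assumptions. A strictly convex function psi(theta) induces dual coordinates eta = grad psi(theta), a Riemannian metric given by the Hessian of psi, and a canonical (Bregman) divergence; the example uses the log-sum-exp potential, the cumulant function of a discrete exponential family.

```python
# Sketch of the dually flat structure induced by a convex function.
# Assumptions: psi is the log-sum-exp potential of a discrete exponential
# family; all names are illustrative, not taken from the chapter.
import numpy as np

def psi(theta):
    """Convex potential psi(theta) = log sum_i exp(theta_i)."""
    return np.log(np.sum(np.exp(theta)))

def grad_psi(theta):
    """Dual coordinates eta = grad psi(theta): here the softmax probabilities."""
    e = np.exp(theta - np.max(theta))  # shift for numerical stability
    return e / e.sum()

def metric(theta):
    """Riemannian metric g(theta) = Hessian of psi; for log-sum-exp, diag(p) - p p^T."""
    p = grad_psi(theta)
    return np.diag(p) - np.outer(p, p)

def divergence(theta_p, theta_q):
    """Canonical (Bregman) divergence:
    D(p:q) = psi(theta_p) - psi(theta_q) - <grad psi(theta_q), theta_p - theta_q>."""
    return psi(theta_p) - psi(theta_q) - grad_psi(theta_q) @ (theta_p - theta_q)

theta_p = np.array([0.2, -0.5, 1.0])
theta_q = np.zeros(3)
print("dual coordinates of p:", grad_psi(theta_p))
print("metric at q:\n", metric(theta_q))
print("D(p:q) =", divergence(theta_p, theta_q))  # nonnegative, zero iff theta_p == theta_q
```

In this picture, straight lines in the theta coordinates are the primal geodesics and straight lines in the eta coordinates are the dual geodesics; the generalized Pythagorean theorem mentioned in the abstract is stated in terms of this divergence and the orthogonality defined by the metric above.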

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Amari, S. (2009). Information Geometry and Its Applications: Convex Function and Dually Flat Manifold. In: Nielsen, F. (ed.) Emerging Trends in Visual Computing. ETVC 2008. Lecture Notes in Computer Science, vol. 5416. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-00826-9_4

  • DOI: https://doi.org/10.1007/978-3-642-00826-9_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-00825-2

  • Online ISBN: 978-3-642-00826-9

  • eBook Packages: Computer Science, Computer Science (R0)
