A Brief Philosophical Note on Information

  • Conference paper
Towards Integrative Machine Learning and Knowledge Extraction

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10344)

Abstract

I will start by posing a question that came to my attention some years ago, when I realized the importance of Machine Learning for the future theoretical and applied fields of Computer Science.


References

  1. Aczel, A.D.: Chance. Thunder’s Mouth Press, New York (2004)

  2. Boltzmann, L.: Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen. Sitzungsber. Kais. Akad. Wiss. Wien Math. Naturwiss. Classe 66, 275–370 (1872)

  3. Bonnici, V., Manca, V.: Informational laws of genome structures. Scientific Reports 6, 28840 (2016). http://www.nature.com/articles/srep28840

  4. Brillouin, L.: The negentropy principle of information. J. Appl. Phys. 24, 1152–1163 (1953)

  5. Brush, S.G., Hall, N.S. (eds.): The Kinetic Theory of Gases: An Anthology of Classical Papers with Historical Commentary. Imperial College Press, London (2003)

  6. Calude, C.S.: Information and Randomness: An Algorithmic Perspective. EATCS Series in Theoretical Computer Science. Springer, Heidelberg (1994)

  7. Carnot, S.: Reflections on the Motive Power of Heat (English translation from French edition of 1824, with introduction by Lord Kelvin). Wiley, New York (1890)

  8. Cover, T.M., Thomas, J.A.: Elements of Information Theory. Wiley, New York (1991)

  9. Feller, W.: An Introduction to Probability Theory and Its Applications. Wiley, New York (1968)

  10. James, G., Witten, D., Hastie, T., Tibshirani, R.: An Introduction to Statistical Learning. Springer, New York (2013)

  11. Jaynes, E.T.: Information theory and statistical mechanics. Phys. Rev. 106(4), 620–630 (1957)

  12. Holzinger, A., Jurisica, I. (eds.): Interactive Knowledge Discovery and Data Mining in Biomedical Informatics. LNCS, vol. 8401. Springer, Heidelberg (2014)

  13. Holzinger, A., Hörtenhuber, M., Mayer, C., Bachler, M., Wassertheurer, S., Pinho, A.J., Koslicki, D.: On entropy-based data mining. In: [12], pp. 209–226 (2014)

  14. Holzinger, A. (ed.): Machine Learning for Health Informatics. LNAI, vol. 9605. Springer, Cham (2016)

  15. Manca, V.: Infobiotics: Information in Biotic Systems. Springer, Heidelberg (2013)

  16. Manca, V.: Grammars for discrete dynamics. In: [14], pp. 37–58. Springer, Heidelberg (2016)

  17. Manca, V.: The principles of informational genomics. Theoret. Comput. Sci. (2017)

  18. Manca, V.: An informational proof of H-theorem. Open Access Library (Modern Physics) 4, e3396 (2017)

  19. Sharp, K., Matschinsky, F.: Translation of Ludwig Boltzmann’s Paper “On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium”. Entropy 17, 1971–2009 (2015)

  20. Schrödinger, E.: What is Life? The Physical Aspect of the Living Cell. Cambridge University Press, Cambridge (1944)

  21. Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 623–656 (1948)

  22. Schervish, M.J.: Theory of Statistics. Springer, New York (1995)

  23. Turing, A.M.: On computable numbers, with an application to the Entscheidungsproblem. Proc. London Math. Soc. 42(1), 230–265 (1936)

  24. Wheeler, J.A.: Information, physics, quantum: The search for links. In: Zurek, W.H. (ed.) Complexity, Entropy, and the Physics of Information. Addison-Wesley, Redwood City (1990)

  25. Wiener, N.: Cybernetics: Or Control and Communication in the Animal and the Machine. Hermann, Paris (1948)

Author information

Corresponding author

Correspondence to Vincenzo Manca.

Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Manca, V. (2017). A Brief Philosophical Note on Information. In: Holzinger, A., Goebel, R., Ferri, M., Palade, V. (eds) Towards Integrative Machine Learning and Knowledge Extraction. Lecture Notes in Computer Science, vol. 10344. Springer, Cham. https://doi.org/10.1007/978-3-319-69775-8_8

  • DOI: https://doi.org/10.1007/978-3-319-69775-8_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-69774-1

  • Online ISBN: 978-3-319-69775-8

  • eBook Packages: Computer Science, Computer Science (R0)
