
A framework for machine learning based on dynamic physical fields

Published in Natural Computing.

Abstract

Despite recent successes and advances in artificial intelligence and machine learning, the field continues to be challenged and guided by phenomena and processes observed in the natural world. Humans remain unsurpassed in their efficiency at dealing with and learning from uncertain information arriving in a variety of forms, while a growing number of robust learning and optimisation algorithms build their analytical engines on nature-inspired phenomena. The excellence of neural networks and kernel-based learning methods, and the emergence of particle-, swarm- and social-behaviour-based optimisation methods, are just a few of many signs of a trend towards greater exploitation of nature-inspired models and systems. This work demonstrates how the simple concept of a physical field can be adopted to build a complete framework for supervised and unsupervised learning. Inspiration for artificial learning is drawn from the mechanics of physical fields observed on both micro and macro scales. Exploiting analogies between data and charged particles subjected to gravitational, electrostatic and gas-particle fields, a family of new algorithms has been developed and applied to classification, clustering and data condensation, while properties of the field have further been used for a unique visualisation of classification and classifier-fusion models. The paper includes extensive pictorial examples and visual interpretations of the presented techniques, along with comparative testing on well-known real and artificial datasets.
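To make the field analogy concrete, the gravitational variant of data condensation can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' published algorithm: data points are treated as unit-mass particles, each iteration moves every point along the softened gravity-like resultant force exerted by all the others, and points that drift within a merge radius collapse into a single heavier particle. The function names (`field_condense`, `_merge_close`) and all parameter values are illustrative assumptions.

```python
import numpy as np

def field_condense(X, steps=50, dt=0.01, soft=0.1, merge_eps=0.05):
    """Toy gravitational data-field condensation.

    Each data point is a unit-mass particle; every step it moves along
    the softened gravity-like attraction from all other points, and
    points that come within merge_eps of each other collapse into one
    condensed particle carrying their combined mass.
    """
    pts = np.asarray(X, dtype=float).copy()
    mass = np.ones(len(pts))
    for _ in range(steps):
        diff = pts[None, :, :] - pts[:, None, :]        # diff[i, j] = x_j - x_i
        dist2 = (diff ** 2).sum(axis=-1)                # squared pairwise distances
        # softened attraction: m_j * (x_j - x_i) / (r^2 + soft)^(3/2);
        # the softening term keeps the self-term and near-collisions finite
        force = (mass[None, :, None] * diff) / ((dist2 + soft) ** 1.5)[:, :, None]
        pts = pts + dt * force.sum(axis=1)              # net force on each point
        pts, mass = _merge_close(pts, mass, merge_eps)  # condense coincident points
    return pts, mass

def _merge_close(pts, mass, eps):
    """Collapse groups of points within eps into mass-weighted centroids."""
    new_pts, new_mass = [], []
    used = np.zeros(len(pts), dtype=bool)
    for i in range(len(pts)):
        if used[i]:
            continue
        group = (np.linalg.norm(pts - pts[i], axis=1) < eps) & ~used
        used |= group
        m = mass[group].sum()
        new_pts.append((mass[group, None] * pts[group]).sum(axis=0) / m)
        new_mass.append(m)
    return np.array(new_pts), np.array(new_mass)
```

Run on two tight, well-separated groups of three points each, the sketch condenses them into two heavy particles while conserving total mass, which is the behaviour the data-condensation application of the framework relies on.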




Author information

Corresponding author

Correspondence to Dymitr Ruta.

About this article

Cite this article

Ruta, D., Gabrys, B. A framework for machine learning based on dynamic physical fields. Nat Comput 8, 219–237 (2009). https://doi.org/10.1007/s11047-007-9064-6

