
An energy-based SOM model not requiring periodic boundary conditions

  • WSOM 2017
  • Published in: Neural Computing and Applications

Abstract

We present the Resilient Self-organizing Tissue (ReST) model, a self-organized neural model based on an infinitely differentiable (\(C^\infty\)) energy function. ReST extends earlier work on energy-based self-organizing models (Heskes and Kappen, in: IEEE international conference on neural networks, IEEE, pp 1219–1223, 1993) in several ways. First, it converts input–prototype distances into neural activities that are constrained to follow a log-normal distribution. This allows a problem-independent interpretation of neural activities, which facilitates, e.g., outlier detection and visualization. Second, since all neural activities are constrained, in particular, to exhibit a predetermined temporal mean, the convolution contained in the energy function can be performed using so-called zero-padding with correction (ZPC) instead of periodic boundary conditions. Because periodic boundary conditions impose much stronger constraints on the prototypes, using ReST with ZPC leads to markedly lower quantization errors, especially for small map sizes. Additional experiments demonstrate the value of a \(C^\infty\) energy function, namely for novelty detection and for automatic control of SOM parameters.
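The boundary-handling issue described in the abstract can be sketched generically. The following is not the authors' implementation: the map size, the Gaussian neighborhood kernel, and the mask-based renormalization (a simple stand-in for the paper's mean-based ZPC correction) are all illustrative assumptions.

```python
import numpy as np

# Toy 2-D lattice of SOM neural activities (log-normally distributed,
# mirroring ReST's activity constraint). Map size is illustrative.
rng = np.random.default_rng(0)
activities = rng.lognormal(mean=0.0, sigma=0.25, size=(5, 5))

def gaussian_kernel(radius, sigma):
    """Normalized 2-D Gaussian neighborhood kernel."""
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def convolve2d(field, kernel, boundary):
    """Same-size convolution with 'wrap' (periodic) or zero padding."""
    r = kernel.shape[0] // 2
    pad_mode = "wrap" if boundary == "wrap" else "constant"
    padded = np.pad(field, r, mode=pad_mode)  # constant pads with zeros
    out = np.empty_like(field)
    n = 2 * r + 1
    for i in range(field.shape[0]):
        for j in range(field.shape[1]):
            out[i, j] = np.sum(padded[i:i + n, j:j + n] * kernel)
    return out

kernel = gaussian_kernel(radius=2, sigma=1.0)

# Periodic boundaries: the map is treated as a torus.
periodic = convolve2d(activities, kernel, boundary="wrap")

# Plain zero padding: units near the border are systematically damped,
# because part of their neighborhood falls outside the map.
zero_padded = convolve2d(activities, kernel, boundary="zero")

# Correction via the convolved valid-position mask: border responses are
# renormalized so that a constant activity field stays constant.
mask = convolve2d(np.ones_like(activities), kernel, boundary="zero")
corrected = zero_padded / mask
```

With periodic boundaries, every unit's neighborhood wraps around, which couples opposite edges of the map and constrains the prototypes accordingly; zero padding avoids that coupling but damps border activities unless a correction such as ZPC compensates for the missing neighborhood mass.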

Figures 1–10 (images not reproduced in this extract)

References

  1. Abadi M, Barham P, Chen J, Chen Z, Davis A, Dean J, Devin M, Ghemawat S, Irving G, Isard M et al (2016) Tensorflow: a system for large-scale machine learning. OSDI 16:265–283

  2. Acharya S, Pant AK, Gyawali PK (2015) Deep learning based large scale handwritten Devanagari character recognition. In: 2015 9th international conference on software, knowledge, information management and applications (SKIMA). IEEE

  3. Bunte K, Haase S, Biehl M, Villmann T (2012) Stochastic neighbor embedding (SNE) for dimension reduction and visualization using arbitrary divergences. Neurocomputing 90:23–45 (Advances in artificial neural networks, machine learning, and computational intelligence, ESANN 2011)

  4. Cohen G, Afshar S, Tapson J, Van Schaik A (2017) EMNIST: extending MNIST to handwritten letters. In: Proceedings of the international joint conference on neural networks 2017, May, pp 2921–2926

  5. Cottrell M, Fort J-C, Pagès G (1998) Theoretical aspects of the SOM algorithm. Neurocomputing 21(1):119–138

  6. Erwin E, Obermayer K, Schulten K (1992) Self-organizing maps: ordering, convergence properties and energy functions. Biol Cybern 67(1):47–55

  7. Flexer A (2001) On the use of self-organizing maps for clustering and visualization. Intell Data Anal 5(5):373–384

  8. Gepperth A, Karaoguz C (2016) A bio-inspired incremental learning architecture for applied perceptual problems. Cogn Comput 8(5):924–934

  9. Goodfellow I, Bengio Y, Courville A (2016) Deep learning. MIT Press, Cambridge. http://www.deeplearningbook.org

  10. Graepel T, Burger M, Obermayer K (1998) Self-organizing maps: generalizations and new optimization techniques. Neurocomputing 21(1–3):173–190

  11. Heskes TM, Kappen B (1993) Error potentials for self-organization. In: IEEE international conference on neural networks, 1993, IEEE, pp 1219–1223

  12. Heskes T (1999) Energy functions for self-organizing maps. In: Oja E, Kaski S (eds) Kohonen maps. Elsevier, Amsterdam, pp 303–315

  13. Jähne B (2005) Digital image processing, 6th edn. Springer, Berlin

  14. Kohonen T (1982) Self-organized formation of topologically correct feature maps. Biol Cybern 43:59–69

  15. LeCun Y, Bottou L, Bengio Y, Haffner P (1998) Gradient-based learning applied to document recognition. Proc IEEE 86(11):2278–2324

  16. Lefort M, Hecht T, Gepperth A (2015) Using self-organizing maps for regression: the importance of the output function. In: European symposium on artificial neural networks (ESANN)

  17. Shieh S-L, Liao I-E (2012) A new approach for data clustering and visualization using self-organizing maps. Expert Syst Appl 39(15):11924–11933

  18. Tolat V (1990) An analysis of Kohonen’s self-organizing maps using a system of energy functions. Biol Cybern 64(2):155–164

  19. Van der Maaten L (2014) Accelerating t-SNE using tree-based algorithms. J Mach Learn Res 15(1):3221–3245

  20. Van der Maaten L, Hinton G (2008) Visualizing data using \(t\)-SNE. J Mach Learn Res 9:2579–2605

  21. Vesanto J (1999) SOM-based data visualization methods. Intell Data Anal 3(2):111–126

  22. Xiao H, Rasul K, Vollgraf R (2017) Fashion-MNIST: a novel image dataset for benchmarking machine learning algorithms. arXiv:1708.07747

Author information

Corresponding author

Correspondence to Alexander Gepperth.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

About this article

Cite this article

Gepperth, A. An energy-based SOM model not requiring periodic boundary conditions. Neural Comput & Applic 32, 18045–18058 (2020). https://doi.org/10.1007/s00521-019-04028-9
