Complex-Valued Embeddings of Generic Proximity Data

  • Conference paper
  • In: Structural, Syntactic, and Statistical Pattern Recognition (S+SSPR 2021)

Abstract

Proximities are at the heart of almost all machine learning methods. In a more generic view, objects are compared by a (symmetric) similarity or dissimilarity measure, which may not obey particular mathematical properties. This renders many machine learning methods invalid, leading to convergence problems and the loss of generalization behavior. In many cases, the preferred dissimilarity measure is not metric. If the input data are non-vectorial, like text sequences, proximity-based learning is used or embedding techniques can be applied. Standard embeddings lead to the desired fixed-length vector encoding, but are costly and are limited in preserving the full information. As an information preserving alternative, we propose a complex-valued vector embedding of proximity data, to be used in respective learning approaches. In particular, we address supervised learning and use extensions of prototype-based learning. The proposed approach is evaluated on a variety of standard benchmarks showing good performance compared to traditional techniques in processing non-metric or non-psd proximity data.
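The "extensions of prototype-based learning" mentioned above operate directly on complex-valued vectors. As an illustrative sketch only (not the authors' implementation; all names and values are hypothetical), a nearest-prototype rule carries over unchanged because the squared Hermitian distance is real-valued and non-negative:

```python
import numpy as np

# Illustrative nearest-prototype classification on complex-valued
# embedded data; the squared Hermitian distance is real, so the
# usual winner-takes-all rule applies unchanged.
def hermitian_sq_dist(x, w):
    d = x - w
    return np.real(d.conj() @ d)  # (x - w)^H (x - w); imaginary part is 0

def predict(x, prototypes, labels):
    dists = [hermitian_sq_dist(x, w) for w in prototypes]
    return labels[int(np.argmin(dists))]

# Toy complex prototypes for two classes (hypothetical values)
W = [np.array([1 + 1j, 0 + 0j]), np.array([-1 - 1j, 0 + 2j])]
y = ["A", "B"]
print(predict(np.array([0.9 + 1.1j, 0.1j]), W, y))  # closest to the first prototype: "A"
```

A GLVQ-style cost function [13] can then be optimized over such prototypes using Wirtinger calculus [21], as explored for complex coefficient spaces in [18].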

M.M. is supported by the ESF (WiT-HuB 4/2014–2020), project KI-trifft-KMU, StMBW-W-IX.4-6-190065. M.B. and M.S. acknowledge support through the Northern Netherlands Region of Smart Factories (RoSF) consortium, led by Noordelijke Ontwikkelings en Investerings Maatschappij (NOM), The Netherlands, see http://www.rosf.nl.


Notes

  1.

    The associated similarity matrix can be obtained by double centering [12] of the dissimilarity matrix. \(\mathbf {S} = -\mathbf {J} \mathbf {D} \mathbf {J}/2\) with \(\mathbf {J} = (\mathbf {I}-\mathbf {1}\mathbf {1}^\top /N)\), identity matrix \(\mathbf {I}\) and vector of ones \(\mathbf {1}\).

  2.

    Some heuristic ideas on Landmark MDS, which is imprecise, are discussed in [2].
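Note 1's double-centering step, combined with an eigendecomposition, suggests how such a complex-valued embedding can be computed in practice. The following sketch is illustrative only (function names and the toy dissimilarity matrix are not from the paper): negative eigenvalues of the indefinite similarity matrix contribute imaginary coordinates, and the bilinear products \(\mathbf{X}\mathbf{X}^\top\) recover \(\mathbf{S}\).

```python
import numpy as np

def complex_embedding(D):
    """Complex-valued embedding of a symmetric dissimilarity matrix D."""
    N = D.shape[0]
    J = np.eye(N) - np.ones((N, N)) / N   # J = I - 1 1^T / N
    S = -J @ D @ J / 2                    # double centering (Note 1)
    S = (S + S.T) / 2                     # symmetrize against round-off
    lam, U = np.linalg.eigh(S)            # S may be indefinite (non-psd)
    X = U * np.sqrt(lam.astype(complex))  # negative eigenvalues -> imaginary axes
    return X, S

# Toy non-metric dissimilarities: d(1,3) = 9 > d(1,2) + d(2,3) = 3
D = np.array([[0.0, 2.0, 9.0],
              [2.0, 0.0, 1.0],
              [9.0, 1.0, 0.0]])
X, S = complex_embedding(D)
print(np.allclose(X @ X.T, S))  # True: bilinear products X X^T recover S
```

Note that the reconstruction uses the plain transpose (a bilinear form), not the Hermitian transpose: for a negative eigenvalue \(\lambda\), \((i\sqrt{|\lambda|})^2 = \lambda\), so no information is lost.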

References

  1. Duin, R.P.: PRTools (2012). http://www.prtools.org

  2. Gisbrecht, A., Schleif, F.: Metric and non-metric proximity transformations at linear costs. Neurocomputing 167, 643–657 (2015)

  3. Goldfarb, L.: A unified approach to pattern recognition. Pattern Recogn. 17(5), 575–582 (1984)

  4. Hofmann, T., Buhmann, J.M.: Pairwise data clustering by deterministic annealing. IEEE Trans. Pattern Anal. Mach. Intell. 19(1), 1–14 (1997)

  5. Iosifidis, A., Gabbouj, M.: Nyström-based approximate kernel subspace learning. Pattern Recogn. 57, 190–197 (2016)

  6. Jain, A., Zongker, D.: Representation and recognition of handwritten digits using deformable templates. IEEE Trans. Pattern Anal. Mach. Intell. 19(12), 1386–1391 (1997)

  7. Kar, P., Jain, P.: Similarity-based learning via data driven embeddings. In: Proceedings of Advances in Neural Information Processing Systems 24: 25th NIPS 2011, Granada, Spain, pp. 1998–2006 (2011)

  8. Mokbel, B.: Dissimilarity-based learning for complex data. Ph.D. thesis, Bielefeld University (2016). https://nbn-resolving.de/urn:nbn:de:hbz:361-29004254

  9. Münch, M., Raab, C., Biehl, M., Schleif, F.: Structure preserving encoding of non-Euclidean similarity data. In: Proceedings of 9th ICPRAM 2020, pp. 43–51 (2020)

  10. Neuhaus, M., Bunke, H.: Edit distance based kernel functions for structural pattern classification. Pattern Recogn. 39(10), 1852–1863 (2006)

  11. Oglic, D., Gärtner, T.: Scalable learning in reproducing Kernel Krein spaces. In: Proceedings of the 36th ICML 2019, USA, pp. 4912–4921 (2019)

  12. Pekalska, E., Duin, R.: The Dissimilarity Representation for Pattern Recognition. World Scientific (2005)

  13. Sato, A., Yamada, K.: Generalized Learning Vector Quantization. In: Proceedings of 8th NIPS 1995 (NIPS’95), pp. 423–429. MIT Press, Cambridge, MA, USA (1995)

  14. Schleif, F., Tiño, P.: Indefinite proximity learning: a review. Neural Comput. 27(10), 2039–2096 (2015)

  15. Schneider, P., Biehl, M., Hammer, B.: Adaptive relevance matrices in learning vector quantization. Neural Comput. 21(12), 3532–3561 (2009)

  16. Shawe-Taylor, J., Cristianini, N.: Kernel Methods for Pattern Analysis and Discovery. Cambridge University Press, Cambridge (2004)

  17. Smith, T.F., Waterman, M.S.: Identification of common molecular subsequences. J. Mol. Biol. 147(1), 195–197 (1981)

  18. Straat, M., et al.: Learning vector quantization and relevances in complex coefficient space. Neural Comput. Appl. 32, 18085–18099 (2019)

  19. Trabelsi, C., et al.: Deep complex networks. In: 6th ICLR 2018 (2018)

  20. van Veen, R., et al.: An application of generalized matrix learning vector quantization in neuroimaging. Comp. Meth. Progr. Biomed. 197, 105708 (2020)

  21. Wirtinger, W.: Zur formalen Theorie der Funktionen von mehr komplexen Veränderlichen. Math. Ann. 97, 357–376 (1927)

  22. Zhang, L., Zhou, W., Jiao, L.: Complex-valued support vector classifiers. Digit. Signal Process. 20(3), 944–955 (2010)

Author information


Corresponding author

Correspondence to Maximilian Münch.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Münch, M., Straat, M., Biehl, M., Schleif, F.M. (2021). Complex-Valued Embeddings of Generic Proximity Data. In: Torsello, A., Rossi, L., Pelillo, M., Biggio, B., Robles-Kelly, A. (eds) Structural, Syntactic, and Statistical Pattern Recognition. S+SSPR 2021. Lecture Notes in Computer Science, vol 12644. Springer, Cham. https://doi.org/10.1007/978-3-030-73973-7_2

  • DOI: https://doi.org/10.1007/978-3-030-73973-7_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-73972-0

  • Online ISBN: 978-3-030-73973-7

  • eBook Packages: Computer Science, Computer Science (R0)
