A New Framework for Classifying Probability Density Functions

  • Conference paper
  • First Online:
Machine Learning and Knowledge Discovery in Databases: Research Track (ECML PKDD 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14169)

Abstract

This paper introduces a new framework for classifying probability density functions. The proposed method belongs to the class of constrained Gaussian processes indexed by distribution functions. First, instead of classifying observations directly, we consider their isometric transformations, which enables us to satisfy both the positivity and unit-integral hard constraints. Second, we introduce the theoretical properties and give numerical details of how to decompose each transformed observation in an appropriate orthonormal basis. As a result, we show that the coefficients lie on the unit sphere equipped with the standard Euclidean metric as its natural metric. Finally, the proposed methods are illustrated and successfully evaluated in different configurations and on various datasets.
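
The page gives no implementation details, but the pipeline sketched in the abstract (isometric transformation of each density, expansion in an orthonormal basis, coefficients on the unit sphere) can be illustrated with a minimal, hedged example. The square-root representation psi = sqrt(f) and the cosine basis used below are illustrative assumptions, not necessarily the transformation and basis chosen by the authors.

```python
import numpy as np

# Illustrative sketch only: the paper's exact isometric transformation and
# orthonormal basis are not stated on this page. We ASSUME the square-root
# representation psi = sqrt(f), which maps unit-integral, non-negative
# densities to the unit L2 sphere, and the orthonormal cosine basis
# {1, sqrt(2) cos(pi k t)} on [0, 1].

def sqrt_transform(f_vals):
    """Square-root representation of a density sampled on a grid."""
    return np.sqrt(f_vals)

def cosine_basis(t, n_basis):
    """Orthonormal cosine basis on [0, 1]."""
    rows = [np.ones_like(t)]
    rows += [np.sqrt(2.0) * np.cos(np.pi * k * t) for k in range(1, n_basis)]
    return np.stack(rows)                      # shape: (n_basis, len(t))

def basis_coefficients(psi, t, n_basis=20):
    """Project psi onto the basis with a simple Riemann sum on a uniform grid."""
    dt = t[1] - t[0]
    return (cosine_basis(t, n_basis) * psi).sum(axis=1) * dt

# Example: Beta(2, 5) density on [0, 1], f(t) = 30 t (1 - t)^4.
t = np.linspace(0.0, 1.0, 2001)
f = 30.0 * t * (1.0 - t) ** 4
c = basis_coefficients(sqrt_transform(f), t)

# Since psi has unit L2 norm and the basis is orthonormal, the (truncated)
# coefficient vector lies approximately on the unit Euclidean sphere.
print(np.linalg.norm(c))                       # ~1.0
```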

Author information

Corresponding author

Correspondence to Anis Fradi.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Fradi, A., Samir, C. (2023). A New Framework for Classifying Probability Density Functions. In: Koutra, D., Plant, C., Gomez Rodriguez, M., Baralis, E., Bonchi, F. (eds) Machine Learning and Knowledge Discovery in Databases: Research Track. ECML PKDD 2023. Lecture Notes in Computer Science (LNAI), vol 14169. Springer, Cham. https://doi.org/10.1007/978-3-031-43412-9_30

  • DOI: https://doi.org/10.1007/978-3-031-43412-9_30

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-43411-2

  • Online ISBN: 978-3-031-43412-9

  • eBook Packages: Computer Science, Computer Science (R0)
