
Enhanced Metric Learning via Dempster-Shafer Evidence Theory

  • Conference paper — Neural Information Processing (ICONIP 2018)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11303)


Abstract

Metric learning is an active topic in machine learning. A properly learned metric measures the similarity between samples more faithfully and hence can significantly improve the performance of machine learning algorithms. In this paper, we propose a novel enhanced distance metric learning method based on Dempster-Shafer (D-S) evidence theory. We treat each instance as an independent source of evidence and combine these pieces of evidence using Dempster's rule. First, with reference to D-S theory, we construct a balanced weight function for each instance in the metric. Second, we introduce a novel competitive cost function, which improves classifier accuracy by narrowing intra-class distances and enlarging inter-class distances. Finally, we carry out a series of classification experiments on UCI and face recognition data sets. The experimental results confirm that the proposed method significantly improves both the accuracy of the classifier and the robustness of the algorithm.
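The evidence-combination step the abstract refers to is Dempster's rule from D-S evidence theory. The following is a minimal generic sketch of that rule for a small frame of discernment; it illustrates the combination mechanism only, not the paper's specific balanced weight function or competitive cost function, and all names in it are illustrative.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions via Dempster's rule.

    Each mass function is a dict mapping a focal element
    (a frozenset of class labels) to its assigned mass.
    """
    combined = {}
    conflict = 0.0  # total mass on contradictory (empty-intersection) pairs
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        a = b & c
        if a:  # non-empty intersection: evidence agrees on set a
            combined[a] = combined.get(a, 0.0) + mb * mc
        else:
            conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; rule undefined")
    norm = 1.0 - conflict  # renormalize over non-conflicting mass
    return {a: mass / norm for a, mass in combined.items()}

# Two instances viewed as independent evidence sources over classes {'A', 'B'}
m1 = {frozenset({'A'}): 0.6, frozenset({'A', 'B'}): 0.4}
m2 = {frozenset({'B'}): 0.3, frozenset({'A', 'B'}): 0.7}
m = dempster_combine(m1, m2)
```

Here the conflicting pair ({'A'} vs. {'B'}) contributes mass 0.18, which is discarded and renormalized away, so the combined masses sum to one while sharpening belief toward the class both sources can support.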



Acknowledgments

This work was supported in part by the Shanghai Sailing Program (16YF1404000) and the National Natural Science Foundation of China under Grants Nos. 11771276, 11471208, and 11601315.

Author information

Corresponding author: Yaxin Peng


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Li, Y., Zhang, Y., Peng, Y. (2018). Enhanced Metric Learning via Dempster-Shafer Evidence Theory. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol 11303. Springer, Cham. https://doi.org/10.1007/978-3-030-04182-3_35


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-04181-6

  • Online ISBN: 978-3-030-04182-3

  • eBook Packages: Computer Science, Computer Science (R0)
