Abstract
Metric learning is an active topic in machine learning. A well-learned metric measures the similarity between samples more faithfully and can therefore significantly improve the performance of machine learning algorithms. In this paper, we propose a novel enhanced distance metric learning method based on Dempster-Shafer (D-S) evidence theory. We treat each instance as an independent source of evidence and combine these pieces of evidence using Dempster's rule. First, drawing on D-S theory, we construct a balanced weight function for each instance in the metric. Second, we introduce a novel competitive cost function that improves classifier accuracy by shrinking intra-class distances and enlarging inter-class distances. Finally, we conduct a series of classification experiments on UCI and face recognition data sets. The experimental results confirm that the proposed method significantly improves both classifier performance and the robustness of the algorithm.
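The abstract relies on combining per-instance evidence with Dempster's rule but does not spell out the formula. As a minimal illustration only, the standard rule of combination for two mass functions over a finite frame of discernment can be sketched as below; the `dempster_combine` helper and the toy mass assignments are our own illustrative assumptions, not the paper's implementation.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions via Dempster's rule.

    Each mass function is a dict mapping a frozenset (a subset of the
    frame of discernment) to its assigned mass; masses sum to 1.
    """
    combined = {}
    conflict = 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:  # non-empty intersection contributes to the combined mass
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:      # empty intersection is conflicting evidence
            conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence: rule undefined.")
    # Normalize by 1 - K, where K is the total conflict
    norm = 1.0 - conflict
    return {a: v / norm for a, v in combined.items()}

# Toy example: two sources of evidence over the frame {a, b}
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m2 = {frozenset({"b"}): 0.5, frozenset({"a", "b"}): 0.5}
m = dempster_combine(m1, m2)
# m[{a}] = 0.3/0.7, m[{b}] = 0.2/0.7, m[{a,b}] = 0.2/0.7
```

The normalization step redistributes the conflicting mass K over the consistent hypotheses, which is what allows each training instance to act as an independent, fusable source of evidence.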
Acknowledgments
This work was supported in part by Shanghai Sailing Program (16YF1404000), the National Natural Science Foundation of China under Grants Nos. 11771276, 11471208 and 11601315.
© 2018 Springer Nature Switzerland AG
Li, Y., Zhang, Y., Peng, Y. (2018). Enhanced Metric Learning via Dempster-Shafer Evidence Theory. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science(), vol 11303. Springer, Cham. https://doi.org/10.1007/978-3-030-04182-3_35
Print ISBN: 978-3-030-04181-6
Online ISBN: 978-3-030-04182-3