Abstract
In this paper we address the problem of applying a stochastic classifier and a local, fuzzy confusion matrix within the framework of multi-label classification. We propose a novel solution to the problem of correcting Binary Relevance ensembles. The main step of the correction procedure is to compute label-wise competence and cross-competence measures, which model the error patterns of the underlying classifier. The method was evaluated on 20 benchmark datasets. To assess the efficiency of the introduced model, it was compared against 3 state-of-the-art approaches using 4 different evaluation measures. Although the introduced algorithm, like its base algorithm Binary Relevance, is insensitive to dependencies between labels, the conducted experimental study reveals that the proposed algorithm outperforms the other methods in terms of Hamming loss and False Discovery Rate.
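To make the baseline concrete, the following is a minimal sketch of the plain Binary Relevance decomposition that the paper's correction procedure builds on: one independent binary classifier is trained per label, and predictions are made label by label. This is not the authors' fuzzy-confusion-matrix correction; the nearest-centroid base learner and all names below are illustrative stand-ins for any binary classifier.

```python
# Hedged sketch of Binary Relevance (the uncorrected baseline), assuming a
# trivial nearest-centroid rule as the per-label base classifier.

def centroid(rows):
    # Component-wise mean of a list of feature vectors; None if empty.
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)] if rows else None

class BinaryRelevance:
    def fit(self, X, Y):
        # Y is a list of binary label vectors; train one model per label,
        # ignoring any dependencies between labels.
        self.models = []
        n_labels = len(Y[0])
        for j in range(n_labels):
            pos = [x for x, y in zip(X, Y) if y[j] == 1]
            neg = [x for x, y in zip(X, Y) if y[j] == 0]
            self.models.append((centroid(pos), centroid(neg)))
        return self

    def predict(self, x):
        # Each label is decided independently by its own binary model.
        def sqdist(a, b):
            return sum((u - v) ** 2 for u, v in zip(a, b))
        out = []
        for pos_c, neg_c in self.models:
            if pos_c is None:
                out.append(0)
            elif neg_c is None:
                out.append(1)
            else:
                out.append(1 if sqdist(x, pos_c) <= sqdist(x, neg_c) else 0)
        return out
```

The correction step described in the abstract would then adjust these label-wise decisions using competence and cross-competence measures estimated from a local fuzzy confusion matrix, rather than taking each binary output at face value.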
Acknowledgements
Computational resources were provided by PL-Grid Infrastructure.
© 2015 Springer International Publishing Switzerland
Cite this paper
Trajdos, P., Kurzynski, M. (2015). An Extension of Multi-label Binary Relevance Models Based on Randomized Reference Classifier and Local Fuzzy Confusion Matrix. In: Jackowski, K., Burduk, R., Walkowiak, K., Wozniak, M., Yin, H. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2015. IDEAL 2015. Lecture Notes in Computer Science(), vol 9375. Springer, Cham. https://doi.org/10.1007/978-3-319-24834-9_9
Print ISBN: 978-3-319-24833-2
Online ISBN: 978-3-319-24834-9