Abstract
Co-training is a semi-supervised learning framework that has attracted much attention due to its good performance and the ease with which it can be adapted to various learning algorithms. In recent work, Caldas et al. proposed a co-training-based method built on the recently introduced supervised learning method known as the minimal learning machine (MLM). Although this method, referred to as Co-MLM, achieved results comparable to those of other semi-supervised algorithms, using MLM as a base learner led to a formulation with a heavy computational cost. To mitigate this problem, in this paper we propose an improved variant of Co-MLM with reduced computational cost in both the training and testing phases. The proposed method is compared to Co-MLM and other co-training-based semi-supervised methods, achieving comparable performance.
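For readers unfamiliar with the underlying scheme, the generic co-training loop of Blum and Mitchell, which Co-MLM instantiates, can be sketched as follows. This is an illustrative sketch only: a simple nearest-centroid classifier stands in for the MLM base learner, and all function names and data are hypothetical, not the authors' implementation.

```python
# Sketch of the generic co-training loop (Blum and Mitchell, 1998).
# A nearest-centroid classifier is used as a stand-in base learner;
# this is NOT the MLM formulation from the paper, only the loop structure.

def centroid_fit(X, y):
    """Return per-class feature centroids for one view."""
    return {c: [sum(col) / len(col)
                for col in zip(*[x for x, t in zip(X, y) if t == c])]
            for c in sorted(set(y))}

def centroid_predict(model, x):
    """Return (label, confidence); confidence = negative centroid distance."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    scored = {c: -dist(x, mu) for c, mu in model.items()}
    label = max(scored, key=scored.get)
    return label, scored[label]

def co_train(X1, X2, y, U1, U2, rounds=3):
    """X1, X2: labeled data in two feature views; y: labels.
    U1, U2: the same unlabeled points in each view. Each round, each
    view's classifier labels its most confident unlabeled point and
    moves it (in both views) into the labeled pool."""
    X1, X2, y = list(X1), list(X2), list(y)
    U1, U2 = list(U1), list(U2)
    for _ in range(rounds):
        for Xv, Uv in ((X1, U1), (X2, U2)):
            if not Uv:
                return centroid_fit(X1, y), centroid_fit(X2, y)
            model = centroid_fit(Xv, y)
            preds = [centroid_predict(model, u) for u in Uv]
            best = max(range(len(Uv)), key=lambda i: preds[i][1])
            label = preds[best][0]
            # the point is paired across views: move both copies
            X1.append(U1.pop(best))
            X2.append(U2.pop(best))
            y.append(label)
    return centroid_fit(X1, y), centroid_fit(X2, y)
```

The cost concern addressed by the paper arises inside this loop: the base learner is retrained repeatedly as unlabeled points are absorbed, so any per-fit cost of the base learner (heavy for the original MLM) is multiplied by the number of co-training rounds.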
References
Blum, A., Mitchell, T.: Combining labeled and unlabeled data with co-training. In: Proceedings of the 11th Annual Conference on Computational Learning Theory, ACM, pp. 92–100 (1998)
Li, M., Zhou, Z.H.: Improve computer-aided diagnosis with machine learning techniques using undiagnosed samples. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 37(6), 1088–1098 (2007)
Zhou, Z.H., Li, M.: Tri-training: exploiting unlabeled data using three classifiers. IEEE Trans. Knowl. Data Eng. 17(11), 1529–1541 (2005)
Zhou, Y., Goldman, S.: Democratic co-learning. In: 16th IEEE International Conference on Tools with Artificial Intelligence (ICTAI 2004), pp. 594–602. IEEE (2004)
Caldas, W.L., Cacais, M.G., Mesquita, D.P.P., Gomes, J.P.P.: Co-MLM: an SSL algorithm based on the minimal learning machine. In: Proceedings of the Brazilian Conference on Intelligent Systems, BRACIS '16 (2016)
Souza, Jr., A.H., Corona, F., Miche, Y., Lendasse, A., Barreto, G.A., Simula, O.: Minimal learning machine: a new distance-based method for supervised learning. In: Proceedings of the 12th International Conference on Artificial Neural Networks: Advances in Computational Intelligence—Volume Part I, IWANN’13, pp. 408–416. Springer, Berlin, Heidelberg (2013)
Mesquita, D.P.P., Gomes, J.P.P., Souza, Jr., A.H.: A minimal learning machine for datasets with missing values. In: Neural Information Processing: 22nd International Conference, ICONIP 2015, Istanbul, Turkey, November 9–12, 2015, Proceedings, Part I, pp. 565–572. Springer, New York (2015)
Mesquita, D.P.P., Gomes, J.P.P., Junior, A.H.S.: Ensemble of minimal learning machines for pattern classification. In: Advances in Computational Intelligence: 13th International Work-Conference on Artificial Neural Networks, IWANN 2015, Palma de Mallorca, Spain, June 10-12, 2015. Proceedings, Part II, pp. 142–152 (2015)
Souza Júnior, A.H., Corona, F., Barreto, G.A., Miche, Y., Lendasse, A.: Minimal learning machine: a novel supervised distance-based approach for regression and classification. Neurocomputing 164, 34–44 (2015)
Mesquita, D.P.P., Gomes, J.P.P., Souza, Jr., A.H.: Ensemble of efficient minimal learning machines for classification and regression. Neural Process. Lett. 46(3), 751–766 (2017). http://doi.org/10.1007/s11063-017-9587-5
Dasgupta, S., Littman, M.L., McAllester, D.: PAC generalization bounds for co-training. Adv. Neural Inf. Process. Syst. 1, 375–382 (2002)
Brefeld, U.: Multi-view learning with dependent views. In: Proceedings of the 30th Annual ACM Symposium on Applied Computing, SAC '15, pp. 865–870. ACM, New York, NY (2015)
Park, H.S., Jun, C.H.: A simple and fast algorithm for k-medoids clustering. Expert Syst. Appl. 36(2), 3336–3341 (2009)
Oliveira, A.C., Gomes, J.A.P.P., Rocha Neto, A.R., Souza, Jr., A.H.: Efficient minimal learning machines with reject option. In: Proceedings of the Brazilian Conference on Intelligent Systems, BRACIS '16 (2016)
Hayes, M.: Recursive least squares. Statistical Digital Signal Processing and Modeling, p. 541 (1996)
Dalitz, C.: Reject options and confidence measures for kNN classifiers. In: Document Image Analysis with the Gamera Framework, pp. 16–38. Shaker Verlag (2009)
McCall, C., Reddy, K.K., Shah, M.: Macro-class selection for hierarchical k-NN classification of inertial sensor data. In: Benavente-Peces, C., Ali, F.H., Filipe, J. (eds.) PECCS, pp. 106–114. SciTePress (2012)
Feger, F., Koprinska, I.: Co-training using RBF nets and different feature splits. In: International Joint Conference on Neural Networks (IJCNN '06), pp. 1878–1885. IEEE (2006)
Blum, A., Chawla, S.: Learning from labeled and unlabeled data using graph mincuts. In: Proceedings of the 18th International Conference on Machine Learning, ICML ’01, pp. 19–26 (2001)
Nigam, K., McCallum, A.K., Thrun, S., Mitchell, T.: Text classification from labeled and unlabeled documents using EM. Mach. Learn. 39(2–3), 103–134 (2000)
Lichman, M.: UCI machine learning repository. University of California, Irvine, School of Information and Computer Sciences (2013). http://archive.ics.uci.edu/ml. Accessed 6 Mar 2016
Grandvalet, Y., Bengio, Y.: Semi-supervised learning by entropy minimization. Adv. Neural Inf. Process. Syst. 17, 529–536 (2004)
Chapelle, O., Zien, A.: Semi-supervised classification by low density separation. In: AISTATS, pp. 57–64 (2005)
Data.gov: U.S. General Services Administration. https://www.data.gov. Accessed 31 Jan 2017
Demšar, J.: Statistical comparisons of classifiers over multiple data sets. J. Mach. Learn. Res. 7, 1–30 (2006)
Acknowledgements
The authors acknowledge the support of CNPq (Grant 456837/2014-0 and research fellowship Grant 305048/2016-3).
Cite this article
Caldas, W.L., Gomes, J.P.P. & Mesquita, D.P.P. Fast Co-MLM: An Efficient Semi-supervised Co-training Method Based on the Minimal Learning Machine. New Gener. Comput. 36, 41–58 (2018). https://doi.org/10.1007/s00354-017-0027-x