
Fast Co-MLM: An Efficient Semi-supervised Co-training Method Based on the Minimal Learning Machine

  • Special Feature
  • Published in New Generation Computing

Abstract

Co-training is a semi-supervised learning framework that has attracted much attention due to its good performance and its easy adaptation to various learning algorithms. In a recent work, Caldas et al. proposed a co-training-based method built on the recently introduced supervised learning method called the minimal learning machine (MLM). Although this method, referred to as Co-MLM, achieved results comparable to those of other semi-supervised algorithms, using the MLM as a base learner led to a formulation with a heavy computational cost. To mitigate this problem, we propose an improved variant of Co-MLM, named Fast Co-MLM, with reduced computational cost in both the training and testing phases. The proposed method is compared to Co-MLM and to other co-training-based semi-supervised methods, achieving comparable performance.
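For readers unfamiliar with the base learner: the MLM trains by solving a linear least-squares problem that maps distances in input space (to a set of reference points) onto distances in output space, and predicts by choosing the output whose distances best match the estimate. The sketch below is a minimal illustration of this idea, not the authors' implementation; the random reference-point selection and the discrete candidate search used for classification are simplifying assumptions.

```python
import numpy as np

def pairwise_dist(A, B):
    """Euclidean distance matrix between the rows of A and the rows of B."""
    return np.sqrt(((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1))

class MinimalLearningMachine:
    """Illustrative MLM classifier: learn a linear map between input-space
    and output-space distance matrices, then classify by picking the label
    whose output distances best match the estimated ones."""

    def __init__(self, n_ref=10, seed=0):
        self.n_ref = n_ref
        self.seed = seed

    def fit(self, X, Y):
        rng = np.random.default_rng(self.seed)
        # simplifying assumption: reference points chosen at random
        idx = rng.choice(len(X), size=min(self.n_ref, len(X)), replace=False)
        self.R, self.T = X[idx], Y[idx]        # input/output reference points
        Dx = pairwise_dist(X, self.R)          # input-space distances
        Dy = pairwise_dist(Y, self.T)          # output-space distances
        self.B = np.linalg.pinv(Dx) @ Dy       # least-squares distance map
        return self

    def predict(self, X, candidates):
        dx = pairwise_dist(X, self.R)
        dy_hat = dx @ self.B                   # estimated output distances
        Dc = pairwise_dist(candidates, self.T) # true distances per candidate label
        # residual of assigning candidate c to sample i (multilateration cost)
        costs = ((Dc[None, :, :] ** 2 - dy_hat[:, None, :] ** 2) ** 2).sum(axis=-1)
        return costs.argmin(axis=1)            # index into `candidates`

# toy usage: two well-separated Gaussian classes with one-hot outputs
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0, 0.5, (40, 2)), rng.normal(5, 0.5, (40, 2))])
labels = np.repeat([0, 1], 40)
Y = np.eye(2)[labels]                          # one-hot output vectors
mlm = MinimalLearningMachine(n_ref=12).fit(X, Y)
pred = mlm.predict(X, candidates=np.eye(2))
accuracy = (pred == labels).mean()
```

The pseudo-inverse solve is the costly step, and it grows with the labeled set, which a co-training scheme repeatedly enlarges; this is presumably the kind of cost the proposed variant reduces.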


Figures 1–7 are available in the full article.


References

  1. Blum, A., Mitchell, T.: Combining labeled and unlabeled data with co-training. In: Proceedings of the 11th Annual Conference on Computational Learning Theory, ACM, pp. 92–100 (1998)

  2. Li, M., Zhou, Z.H.: Improve computer-aided diagnosis with machine learning techniques using undiagnosed samples. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 37(6), 1088–1098 (2007)


  3. Zhou, Z.H., Li, M.: Tri-training: exploiting unlabeled data using three classifiers. IEEE Trans. Knowl. Data Eng. 17(11), 1529–1541 (2005)


  4. Zhou, Y., Goldman, S.: Democratic co-learning. In: Proceedings of the 16th IEEE International Conference on Tools with Artificial Intelligence (ICTAI 2004), pp. 594–602. IEEE (2004)

  5. Caldas, W.L., Cacais, M.G., Mesquita, D.P.P., Gomes, J.P.P.: Co-MLM: an SSL algorithm based on the minimal learning machine. In: Proceedings of the Brazilian Conference on Intelligent Systems, BRACIS '16 (2016)

  6. Souza, Jr., A.H., Corona, F., Miche, Y., Lendasse, A., Barreto, G.A., Simula, O.: Minimal learning machine: a new distance-based method for supervised learning. In: Proceedings of the 12th International Work-Conference on Artificial Neural Networks: Advances in Computational Intelligence, IWANN '13, Part I, pp. 408–416. Springer, Berlin, Heidelberg (2013)

  7. Mesquita, D.P.P., Gomes, J.P.P., Souza, Jr., A.H.: A minimal learning machine for datasets with missing values. In: Neural Information Processing: 22nd International Conference, ICONIP 2015, Istanbul, Turkey, November 9–12, 2015, Proceedings, Part I, pp. 565–572. Springer, New York (2015)

  8. Mesquita, D.P.P., Gomes, J.P.P., Souza, Jr., A.H.: Ensemble of minimal learning machines for pattern classification. In: Advances in Computational Intelligence: 13th International Work-Conference on Artificial Neural Networks, IWANN 2015, Palma de Mallorca, Spain, June 10–12, 2015, Proceedings, Part II, pp. 142–152 (2015)

  9. Souza Júnior, A.H., Corona, F., Barreto, G.A., Miche, Y., Lendasse, A.: Minimal learning machine: a novel supervised distance-based approach for regression and classification. Neurocomputing 164, 34–44 (2015)


  10. Mesquita, D.P.P., Gomes, J.P.P., Souza, Jr., A.H.: Ensemble of efficient minimal learning machines for classification and regression. Neural Process. Lett. 46(3), 751–766 (2017). https://doi.org/10.1007/s11063-017-9587-5


  11. Dasgupta, S., Littman, M.L., McAllester, D.: PAC generalization bounds for co-training. Adv. Neural Inf. Process. Syst. 1, 375–382 (2002)


  12. Brefeld, U.: Multi-view learning with dependent views. In: Proceedings of the 30th Annual ACM Symposium on Applied Computing, SAC '15, pp. 865–870. ACM, New York, NY (2015)

  13. Park, H.S., Jun, C.H.: A simple and fast algorithm for k-medoids clustering. Expert Syst. Appl. 36(2), 3336–3341 (2009)


  14. Oliveira, A.C., Gomes, J.A.P.P., Rocha Neto, A.R., Souza, Jr., A.H.: Efficient minimal learning machines with reject option. In: Proceedings of the Brazilian Conference on Intelligent Systems, BRACIS '16 (2016)

  15. Hayes, M.: Recursive least squares. Statistical Digital Signal Processing and Modeling, p. 541 (1996)

  16. Dalitz, C.: Reject options and confidence measures for kNN classifiers. In: Document Image Analysis with the Gamera Framework, pp. 16–38. Shaker Verlag (2009)

  17. McCall, C., Reddy, K.K., Shah, M.: Macro-class selection for hierarchical k-NN classification of inertial sensor data. In: Benavente-Peces, C., Ali, F.H., Filipe, J. (eds.) PECCS, pp. 106–114. SciTePress (2012)

  18. Feger, F., Koprinska, I.: Co-training using RBF nets and different feature splits. In: Proceedings of the International Joint Conference on Neural Networks (IJCNN '06), pp. 1878–1885. IEEE (2006)

  19. Blum, A., Chawla, S.: Learning from labeled and unlabeled data using graph mincuts. In: Proceedings of the 18th International Conference on Machine Learning, ICML ’01, pp. 19–26 (2001)

  20. Nigam, K., McCallum, A.K., Thrun, S., Mitchell, T.: Text classification from labeled and unlabeled documents using EM. Mach. Learn. 39(2–3), 103–134 (2000)


  21. Lichman, M.: UCI machine learning repository. University of California, Irvine, School of Information and Computer Sciences (2013). http://archive.ics.uci.edu/ml. Accessed 6 Mar 2016

  22. Grandvalet, Y., Bengio, Y., et al.: Semi-supervised learning by entropy minimization. NIPS 17, 529–536 (2004)


  23. Chapelle, O., Zien, A.: Semi-supervised classification by low density separation. In: AISTATS, pp. 57–64 (2005)

  24. Data.gov. U.S. General Services Administration. https://www.data.gov. Accessed 31 Jan 2017

  25. Demšar, J.: Statistical comparisons of classifiers over multiple data sets. J. Mach. Learn. Res. 7, 1–30 (2006)



Acknowledgements

The authors acknowledge the support of CNPq (Grant 456837/2014-0 and research fellowship Grant 305048/2016-3).

Author information


Corresponding author

Correspondence to João P. P. Gomes.

About this article


Cite this article

Caldas, W.L., Gomes, J.P.P. & Mesquita, D.P.P. Fast Co-MLM: An Efficient Semi-supervised Co-training Method Based on the Minimal Learning Machine. New Gener. Comput. 36, 41–58 (2018). https://doi.org/10.1007/s00354-017-0027-x

