Label Noise Detection Based on Tri-training

  • Conference paper
  • Cloud Computing and Security (ICCCS 2018)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11063)

Abstract

In machine learning, noise in the training data can be divided into attribute noise and label noise, and many studies have shown that label noise is the more harmful of the two. A number of noise filtering algorithms have therefore been proposed to identify and remove mislabeled examples before learning. However, almost all existing methods address the problem in a purely supervised way: noise identification relies only on the information in the labeled data. In many applications, unlabeled data are also available, often in far larger quantities than labeled data. In this paper, we therefore exploit unlabeled data to improve the performance of noise filtering. We adopt tri-training, a powerful semi-supervised learning algorithm, because it does not require the data to be split into multiple independent views. Finally, a set of experiments demonstrates the effectiveness of the proposed method.
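
The sketch below illustrates one way unlabeled data could feed into label noise filtering via tri-training. It is an illustrative approximation, not the authors' exact procedure: the classifier choice (decision trees), the bootstrap initialization, the fixed number of refinement rounds, and the majority-vote filtering rule are all assumptions, and the error-rate safeguards of the original tri-training algorithm are omitted.

import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.utils import resample

def tri_training_noise_filter(X_labeled, y_labeled, X_unlabeled, rounds=5):
    """Return a boolean mask over the labeled set marking suspected label noise."""
    # Step 1 (assumption): initialize three classifiers on bootstrap samples
    # of the labeled data so that they start out diverse.
    classifiers = []
    for seed in range(3):
        X_boot, y_boot = resample(X_labeled, y_labeled, random_state=seed)
        classifiers.append(DecisionTreeClassifier(random_state=seed).fit(X_boot, y_boot))

    # Step 2: tri-training-style refinement -- for each classifier, pseudo-label
    # the unlabeled examples on which the *other two* classifiers agree, then
    # retrain it on the labeled data plus those pseudo-labeled examples.
    # (The error-rate conditions of Zhou & Li's original algorithm are omitted.)
    for _ in range(rounds):
        unlabeled_preds = [clf.predict(X_unlabeled) for clf in classifiers]
        for i in range(3):
            j, k = [m for m in range(3) if m != i]
            agree = unlabeled_preds[j] == unlabeled_preds[k]
            if not np.any(agree):
                continue
            X_aug = np.vstack([X_labeled, X_unlabeled[agree]])
            y_aug = np.concatenate([y_labeled, unlabeled_preds[j][agree]])
            classifiers[i] = DecisionTreeClassifier(random_state=i).fit(X_aug, y_aug)

    # Step 3 (assumption): flag a labeled example as noisy when at least two of
    # the three refined classifiers disagree with its given label.
    votes = np.stack([clf.predict(X_labeled) for clf in classifiers])
    disagreements = (votes != y_labeled).sum(axis=0)
    return disagreements >= 2

Examples flagged by the returned mask would then be removed (or relabeled) before training the final model; the disagreement threshold and the choice between removal and relabeling are design decisions this sketch does not take from the paper.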



Author information

Corresponding author: Hongbin Zhu


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Zhu, H., Liu, J., Wan, M. (2018). Label Noise Detection Based on Tri-training. In: Sun, X., Pan, Z., Bertino, E. (eds) Cloud Computing and Security. ICCCS 2018. Lecture Notes in Computer Science, vol. 11063. Springer, Cham. https://doi.org/10.1007/978-3-030-00006-6_56


  • DOI: https://doi.org/10.1007/978-3-030-00006-6_56

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-00005-9

  • Online ISBN: 978-3-030-00006-6

  • eBook Packages: Computer Science, Computer Science (R0)
