
An Efficient Noisy Label Learning Method with Semi-supervised Learning

Published: 09 June 2023

Abstract

Although deep learning models have succeeded in many application areas, they are well known to be vulnerable to data noise. Consequently, research has been active both on models that detect and remove noisy data and on models that operate robustly in its presence. However, most existing approaches have a limitation: either important information may be discarded while noisy data are cleaned up, or prior information about the dataset is required even though such information may not be readily available. In this paper, we propose an effective semi-supervised learning method that combines model ensembling and parameter scheduling. Our experimental results show that the proposed method achieves the best accuracy under 20% and 40% noise-ratio conditions. The proposed model is robust to data noise, suffering only a 2.08% accuracy degradation when the noise ratio increases from 20% to 60% on CIFAR-10. We additionally perform an ablation study to verify the net accuracy gain from applying each technique in turn.
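The abstract does not spell out the method's details, but two ingredients it names have well-known generic forms in the noisy-label literature: splitting samples by per-sample loss (treating high-loss samples as unlabeled for a semi-supervised stage) and scheduling a loss-weight parameter over training. The sketch below illustrates those generic forms only — it is not the authors' implementation, and the function names, loss values, and noise ratio are illustrative assumptions.

```python
# Hedged sketch of two generic noisy-label ingredients, NOT the paper's
# exact method: (1) small-loss sample selection, (2) a linear ramp-up
# schedule for the unsupervised-loss weight.
import numpy as np


def split_by_small_loss(losses, noise_ratio):
    """Treat the (1 - noise_ratio) fraction of samples with the smallest
    per-sample loss as 'probably clean'; the rest become the unlabeled
    set for the semi-supervised stage."""
    n_clean = int(round(len(losses) * (1.0 - noise_ratio)))
    order = np.argsort(losses)  # indices sorted by ascending loss
    return order[:n_clean], order[n_clean:]


def ramp_up_weight(step, ramp_steps, max_weight):
    """Linearly ramp the unsupervised-loss weight from 0 to max_weight —
    one simple form of parameter scheduling."""
    return max_weight * min(1.0, step / ramp_steps)


# Toy example: six samples, two with conspicuously large loss.
losses = np.array([0.10, 0.20, 2.50, 0.15, 3.00, 0.12])
clean_idx, noisy_idx = split_by_small_loss(losses, noise_ratio=1 / 3)
weight = ramp_up_weight(step=50, ramp_steps=100, max_weight=10.0)
```

In this toy run the two high-loss samples (indices 2 and 4) land in the noisy set, and the unsupervised weight sits at half its maximum midway through the ramp.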


Published In

ICMVA '23: Proceedings of the 2023 6th International Conference on Machine Vision and Applications
March 2023
193 pages
ISBN:9781450399531
DOI:10.1145/3589572

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Classification
  2. Noisy data
  3. Noisy label
  4. Semi-supervised learning

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ICMVA 2023


Article Metrics

  • Total Citations: 0
  • Total Downloads: 35
  • Downloads (Last 12 months): 7
  • Downloads (Last 6 weeks): 1

Reflects downloads up to 03 Mar 2025.
