
Smoothing and Transition Matrices Estimation to Learn with Noisy Labels

  • Conference paper
Image Analysis and Processing – ICIAP 2023 (ICIAP 2023)

Abstract

In recent years, there has been impressive progress in learning with noisy labels, particularly in leveraging a small set of clean data. Meta-learning-based label correction techniques have further advanced performance by correcting noisy labels during training. However, these methods require multiple back-propagation steps, which considerably slow down the training process. Alternatively, some researchers have attempted to estimate the label transition matrix on the fly to address the issue of noisy labels. These approaches are more robust and faster than meta-learning-based techniques. The use of the transition matrix makes the classifier skeptical about all corrected samples, thereby mitigating the problem of label noise. We propose a novel three-head architecture that can efficiently estimate the label transition matrix and two new label smoothing matrices at each iteration. Our approach enables the estimated matrices to closely follow the shifting noise and reduce over-confidence on classes during classifier model training. We report extensive experiments on synthetic and real-world noisy datasets, achieving state-of-the-art performance on synthetic variants of CIFAR-10/100 and on the challenging Clothing1M dataset. Code is available at https://github.com/z3n0e/STM.
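The abstract's point about the transition matrix making the classifier "skeptical" of noisy labels can be illustrated with a standard forward-correction step: the classifier's clean-class posterior is multiplied by a row-stochastic transition matrix before computing the cross-entropy against the (possibly noisy) label. The sketch below is illustrative only, not the paper's three-head architecture; the matrix values and function names are hypothetical.

```python
import numpy as np

def forward_corrected_probs(clean_probs, T):
    """Map the classifier's clean-class posterior to the noisy-label space:
    p(noisy = j | x) = sum_i p(clean = i | x) * T[i, j],
    where T is a row-stochastic label transition matrix."""
    return clean_probs @ T

def cross_entropy(probs, label):
    # Negative log-likelihood of the observed label (with a small epsilon).
    return -np.log(probs[label] + 1e-12)

# Toy 3-class example: the classifier is confident in class 0, but the
# observed label is 1. The (hypothetical) transition matrix says class 0
# is often mislabelled as 1, so the corrected loss penalizes the
# classifier far less than naive cross-entropy would.
p_clean = np.array([0.9, 0.05, 0.05])
T = np.array([[0.7, 0.3, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

p_noisy = forward_corrected_probs(p_clean, T)
loss_corrected = cross_entropy(p_noisy, 1)   # smaller: noise explained by T
loss_naive = cross_entropy(p_clean, 1)       # larger: label treated as clean
```

Because the correction absorbs part of the disagreement between prediction and label into the noise model, the classifier is not forced to fit mislabelled samples, which is the mitigation effect the abstract describes.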



Author information

Correspondence to Simone Ricci.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Ricci, S., Uricchio, T., Del Bimbo, A. (2023). Smoothing and Transition Matrices Estimation to Learn with Noisy Labels. In: Foresti, G.L., Fusiello, A., Hancock, E. (eds) Image Analysis and Processing – ICIAP 2023. ICIAP 2023. Lecture Notes in Computer Science, vol 14233. Springer, Cham. https://doi.org/10.1007/978-3-031-43148-7_38


  • DOI: https://doi.org/10.1007/978-3-031-43148-7_38

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-43147-0

  • Online ISBN: 978-3-031-43148-7

