
Cyclic Annealing Training Convolutional Neural Networks for Image Classification with Noisy Labels



Abstract:

Noisy-label modeling makes a convolutional neural network (CNN) more robust for image classification. However, current noisy-label modeling methods usually require an expectation-maximization (EM) based procedure to optimize the parameters, which is computationally expensive. In this paper, we use a fast annealing training method to speed up CNN training in every M-step. Since this training is repeatedly executed along the entire EM optimization path and yields a local-minimum CNN model from every training cycle, we name it the Cyclic Annealing Training (CAT) approach. In addition to reducing training time, CAT can bag all the local-minimum CNN models at test time to further improve classification performance. We evaluate the proposed method on several image classification datasets with different noisy-label patterns, and the results show that our CAT approach outperforms state-of-the-art noisy-label modeling methods.
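The cyclic annealing idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a cosine-shaped annealing schedule that restarts at the start of each cycle (as in snapshot-ensemble-style training), with one model snapshot saved at the end of each cycle for test-time bagging. The function name `cyclic_annealing_lr` and the step/cycle counts are illustrative choices.

```python
import math

def cyclic_annealing_lr(step, total_steps, num_cycles, lr_max):
    """Cyclic cosine-annealing learning-rate schedule (assumed form).

    The rate restarts at lr_max at the start of each cycle and anneals
    toward zero by the cycle's end, where a snapshot can be saved.
    """
    steps_per_cycle = total_steps // num_cycles
    pos = (step % steps_per_cycle) / steps_per_cycle  # position in [0, 1)
    return 0.5 * lr_max * (1.0 + math.cos(math.pi * pos))

# Collect one snapshot per cycle; at test time the snapshots' predictions
# would be averaged (bagged) to form the final classifier.
snapshots = []
total_steps, num_cycles, lr_max = 300, 3, 0.1
for step in range(total_steps):
    lr = cyclic_annealing_lr(step, total_steps, num_cycles, lr_max)
    # ... one SGD step on the M-step objective with learning rate `lr` ...
    if (step + 1) % (total_steps // num_cycles) == 0:
        snapshots.append(f"model_cycle_{len(snapshots)}")  # stand-in for saved weights

print(len(snapshots))  # number of models available for test-time bagging
```

In an EM setting, this loop would run inside each M-step, so every pass over the optimization path contributes additional snapshots to the ensemble.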
Date of Conference: 07-10 October 2018
Date Added to IEEE Xplore: 06 September 2018
ISBN Information:
Electronic ISSN: 2381-8549
Conference Location: Athens, Greece

