Abstract
The past few years have witnessed the rapid development of regularization methods for deep learning models such as fully connected deep neural networks (DNNs) and convolutional neural networks (CNNs). Some previous methods, such as Dropout, Cutout and DropBlock, drop features from the input data or hidden layers, while DropConnect drops connections between fully connected layers. By randomly discarding features or connections, these methods relieve overfitting and improve the performance of neural networks. In this paper, we propose a novel regularization method, namely DropFilterR, for learning CNNs. The basic idea of DropFilterR is to relax the weight-sharing rule in CNNs by randomly dropping elements of the convolution filters. Specifically, different elements of a filter are dropped as the filter slides over the input feature map. Moreover, a random drop rate can be applied to further increase the randomness of the proposed method. We also derive, based on theoretical analysis, a suitable way to accelerate the computation of DropFilterR. Experimental results on several widely used image databases, including MNIST, CIFAR-10 and Pascal VOC 2012, show that DropFilterR improves performance on image classification tasks.
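To make the described mechanism concrete, the following is a minimal PyTorch-style sketch of a position-dependent filter dropout of this kind. The function name dropfilterr_conv2d, the drop_rate parameter and all implementation details are our own illustrative assumptions, not the authors' implementation; a fresh Bernoulli mask is drawn for the filter at every output position, which is the weight-sharing relaxation the abstract describes, and the computational acceleration mentioned in the abstract is not reproduced here.

```python
import torch
import torch.nn.functional as F


def dropfilterr_conv2d(x, weight, drop_rate=0.1):
    """Illustrative DropFilterR-style convolution (stride 1, odd kernel size).

    x:      (N, C_in, H, W) input feature map
    weight: (C_out, C_in, kH, kW) convolution filters
    A different random binary mask is applied to the filter elements at every
    output spatial position, so the filter is no longer strictly shared.
    """
    n, c_in, h, w = x.shape
    c_out, _, kh, kw = weight.shape
    pad = (kh // 2, kw // 2)

    # Extract sliding patches: (N, C_in*kH*kW, L) with L = H*W output positions.
    patches = F.unfold(x, kernel_size=(kh, kw), padding=pad)
    l = patches.shape[-1]

    # Independent Bernoulli keep-mask per filter element and per output position,
    # rescaled as in inverted dropout.
    keep = 1.0 - drop_rate
    mask = (torch.rand(c_out, c_in * kh * kw, l, device=x.device) < keep).float() / keep

    w_flat = weight.view(c_out, -1)                        # (C_out, C_in*kH*kW)
    masked_w = w_flat.unsqueeze(-1) * mask                 # (C_out, C_in*kH*kW, L)

    # Position-specific dot product between each patch and its masked filter.
    out = torch.einsum('nkl,okl->nol', patches, masked_w)  # (N, C_out, L)
    return out.view(n, c_out, h, w)


# Example usage (hypothetical shapes):
# y = dropfilterr_conv2d(torch.randn(8, 16, 32, 32), torch.randn(32, 16, 3, 3), drop_rate=0.1)
```

Note that materializing a mask for every output position is memory-intensive; this sketch only illustrates the idea, at inference time the unmasked convolution would be used as in standard dropout-style methods.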






Funding
Funding was provided by National Key Research and Development Program of China (Grant No. 2018YFB1003405).
Cite this article
Pan, H., Niu, X., Li, R. et al. DropFilterR: A Novel Regularization Method for Learning Convolutional Neural Networks. Neural Process Lett 51, 1285–1298 (2020). https://doi.org/10.1007/s11063-019-10147-0