Abstract
We propose a novel dropout technique for convolutional neural networks that redesigns the Dropout and DropConnect methods. Conventional drop methods operate on individual weight values of fully connected networks; when they are applied to convolution layers, only some of the kernel weights are removed. However, the weights of a convolutional kernel window together constitute a specific pattern, so dropping only part of a kernel window may alter the learned pattern and end up modeling a completely different local pattern. We therefore make the whole kernel window the basic unit of dropping for convolutional weights, so that exactly one output map value is removed per drop. We evaluated the proposed DropKernel strategy on object classification on CIFAR-10 in comparison to the conventional Dropout and DropConnect methods, and the proposed method showed improved performance.
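Since dropping a whole kernel window removes exactly one output map value, the scheme described in the abstract can be read as element-wise stochastic masking of the convolution output during training, in contrast to DropConnect's masking of individual weights. Below is a minimal PyTorch sketch under that reading; the class name DropKernelConv2d and the drop rate p are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn

class DropKernelConv2d(nn.Module):
    """Sketch of kernel-window dropping (illustrative, not the authors' code).

    Dropout zeroes individual input units and DropConnect zeroes individual
    weights; here the response of an entire kernel window -- one output map
    value -- is dropped with probability p during training.
    """

    def __init__(self, in_channels, out_channels, kernel_size, p=0.3, **conv_kwargs):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size, **conv_kwargs)
        self.p = p  # probability of dropping a kernel-window response

    def forward(self, x):
        y = self.conv(x)
        if self.training and self.p > 0:
            keep = 1.0 - self.p
            # Bernoulli mask over output positions: zeroing one mask entry
            # removes the contribution of the whole kernel window there.
            mask = torch.bernoulli(torch.full_like(y, keep))
            y = y * mask / keep  # inverted-dropout rescaling of survivors
        return y
```

For example, layer = DropKernelConv2d(3, 16, 3, p=0.3) applied to a CIFAR-10 batch of shape (N, 3, 32, 32) drops each kernel-window response independently and rescales the survivors by 1/(1-p), the usual inverted-dropout convention, so no rescaling is needed at test time.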
Acknowledgments
This work was supported by the Institute for Information and communications Technology Promotion (IITP) grant funded by the Korea government (MSIT) (No. R7124-16-0004, Development of Intelligent Interaction Technology Based on Context Awareness and Human Intention Understanding, 50%) and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. NRF-2017M3C1B6071400).
© 2019 Springer Nature Switzerland AG
Cite this paper
Lee, S., Jang, G.-J. (2019). Stochastic Drop of Kernel Windows for Improved Generalization in Convolution Neural Networks. In: Karwowski, W., Ahram, T. (eds) Intelligent Human Systems Integration 2019. IHSI 2019. Advances in Intelligent Systems and Computing, vol 903. Springer, Cham. https://doi.org/10.1007/978-3-030-11051-2_34