Abstract
A common problem with neural networks, especially deep ones, is their lengthy training time. We attempt to address this problem at the algorithmic level, proposing a simple parallel design inspired by the parallel circuits found in the human retina. To avoid large matrix computations, we split the original network vertically into parallel circuits and let the backpropagation algorithm flow through each subnetwork independently. Experimental results demonstrate the speed advantage of the proposed approach, but also show that this advantage depends on several factors. The results further suggest that parallel circuits improve the generalization ability of neural networks, presumably owing to automatic problem decomposition. By studying network sparsity, we partly justify this theory and propose a potential method for improving the design.
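The paper itself does not include source code; as a rough illustration of the design described above, here is a minimal NumPy sketch of one possible parallel-circuit network. The class name, the tanh activation, the additive combination of circuit outputs, and all hyperparameters are our assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the parallel-circuit idea; names and details are ours.
import numpy as np

class ParallelCircuitNet:
    """K independent subnetworks ("circuits") whose outputs are summed.

    Each circuit owns its own small weight matrices, so both the forward
    pass and backpropagation decompose into K independent, parallelizable
    pieces instead of one large matrix computation.
    """

    def __init__(self, n_in, n_hidden, n_out, n_circuits, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.lr = lr
        # One (W1, W2) pair per circuit; each hidden layer is n_hidden wide.
        self.circuits = [
            (rng.normal(0, 0.1, (n_in, n_hidden)),
             rng.normal(0, 0.1, (n_hidden, n_out)))
            for _ in range(n_circuits)
        ]

    def forward(self, x):
        # Each circuit computes its contribution independently.
        self._hidden = []
        y = 0.0
        for W1, W2 in self.circuits:
            h = np.tanh(x @ W1)
            self._hidden.append(h)
            y = y + h @ W2
        return y

    def backward(self, x, grad_y):
        # grad_y = dLoss/dy is shared; each circuit backpropagates it
        # through its own weights, with no cross-circuit terms.
        for i, (W1, W2) in enumerate(self.circuits):
            h = self._hidden[i]
            dW2 = h.T @ grad_y
            dh = (grad_y @ W2.T) * (1.0 - h ** 2)   # tanh derivative
            dW1 = x.T @ dh
            self.circuits[i] = (W1 - self.lr * dW1, W2 - self.lr * dW2)

# Toy usage: fit y = sum(x) with squared error.
net = ParallelCircuitNet(n_in=4, n_hidden=8, n_out=1, n_circuits=3)
rng = np.random.default_rng(1)
for _ in range(500):
    x = rng.normal(size=(16, 4))
    t = x.sum(axis=1, keepdims=True)
    y = net.forward(x)
    net.backward(x, grad_y=(y - t) / len(x))       # gradient of mean squared error
```

Because each circuit's gradients depend only on its own weights, the loop over circuits in both passes could be dispatched to separate workers, which is the speed advantage the abstract refers to.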





Additional information
The project is sponsored by the Crop for the Future Research Centre.
Appendix
See Tables 4, 5, 6, 7, 8 and 9.
Cite this article
Phan, K.T., Maul, T.H. & Vu, T.T. An Empirical Study on Improving the Speed and Generalization of Neural Networks Using a Parallel Circuit Approach. Int J Parallel Prog 45, 780–796 (2017). https://doi.org/10.1007/s10766-016-0435-4