
A regeneratable dynamic differential evolution algorithm for neural networks with integer weights

Published in: Journal of Zhejiang University SCIENCE C

Abstract

Neural networks with integer weights are better suited to embedded systems and hardware implementations than those with real-valued weights. However, many learning algorithms proposed for training networks with floating-point weights are inefficient, or fail outright, when applied to networks with integer weights. This paper presents a novel regeneratable dynamic differential evolution algorithm (RDDE) that trains integer-weight networks efficiently. Compared with the conventional differential evolution (DE) algorithm, RDDE introduces three new strategies: (1) a regeneratable strategy, which restores diversity when all individuals have become identical after several iterations and can no longer evolve; in other words, it lets the population escape local minima; (2) a dynamic strategy, which speeds up convergence and simplifies the algorithm by updating the population dynamically; (3) a local greedy strategy, which improves local search ability once the population approaches the global optimum. Unlike gradient-based algorithms, RDDE needs no gradient information, which has been the main obstacle to training networks with integer weights. Experimental results show that RDDE trains integer-weight networks more efficiently.
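The strategies described above can be sketched as a minimal DE loop over an integer-weight network. This is an illustrative reconstruction, not the authors' implementation: the toy 2-2-1 XOR network, the weight range [-8, 8], and the parameter values (NP, F, CR) are all assumptions, and only the regeneratable and dynamic strategies are shown (the local greedy strategy is omitted for brevity).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: XOR with a tiny 2-2-1 network (hypothetical setup, not from the paper).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

W_MIN, W_MAX = -8, 8            # assumed integer weight range
DIM = 2 * 2 + 2 + 2 * 1 + 1     # W1 (2x2) + b1 (2) + W2 (2x1) + b2 (1) = 9 genes

def forward(w, x):
    """Decode a flat integer vector into the 2-2-1 net and run it."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8].reshape(2, 1), w[8]
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-((h @ W2).ravel() + b2)))

def fitness(w):
    return np.mean((forward(w, X) - y) ** 2)  # mean squared error

NP, F, CR = 30, 0.8, 0.9        # assumed DE parameters
pop = rng.integers(W_MIN, W_MAX + 1, size=(NP, DIM))
cost = np.array([fitness(p) for p in pop])
init_best = cost.min()

for gen in range(300):
    for i in range(NP):
        a, b, c = rng.choice(NP, 3, replace=False)
        # DE/rand/1 mutation, rounded back onto the integer lattice and clipped.
        mutant = np.clip(pop[a] + np.rint(F * (pop[b] - pop[c])).astype(int),
                         W_MIN, W_MAX)
        cross = rng.random(DIM) < CR
        cross[rng.integers(DIM)] = True          # at least one gene from the mutant
        trial = np.where(cross, mutant, pop[i])
        tc = fitness(trial)
        if tc <= cost[i]:
            # "Dynamic" strategy: replace in place immediately, so mutations
            # later in the same generation already see the improved vector.
            pop[i], cost[i] = trial, tc
    # "Regeneratable" strategy: if every individual has collapsed to the same
    # point, reseed all but the best so the search can continue.
    if np.all(pop == pop[0]):
        best = pop[cost.argmin()].copy()
        pop = rng.integers(W_MIN, W_MAX + 1, size=(NP, DIM))
        pop[0] = best
        cost = np.array([fitness(p) for p in pop])

print("best integer-weight MSE:", cost.min())
```

The local greedy strategy could be layered on top of this sketch by, for example, probing the integer neighbors of the best individual once population diversity drops below a threshold; the paper itself should be consulted for the exact formulation.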



Author information

Correspondence to Jian Bao.


Cite this article

Bao, J., Chen, Y. & Yu, Js. A regeneratable dynamic differential evolution algorithm for neural networks with integer weights. J. Zhejiang Univ. - Sci. C 11, 939–947 (2010). https://doi.org/10.1631/jzus.C1000137
