
LTI ODE-valued neural networks

Multiple problem solving using a single neural structure

Published in: Applied Intelligence

Abstract

A dynamical version of the classical McCulloch & Pitts neural model is introduced in this paper. In this new approach, artificial neurons are characterized by: i) inputs in the form of differentiable continuous-time signals, ii) linear time-invariant ordinary differential equations (LTI ODEs) for the connection weights, and iii) activation functions evaluated in the frequency domain. It is shown that this new characterization of the constitutive nodes of an artificial neural network, termed the LTI ODE-valued neural network (LTI ODEVNN), allows multiple problems to be solved at the same time using a single neural structure. Moreover, it is demonstrated that LTI ODEVNNs can be interpreted as complex-valued neural networks (CVNNs), so existing research on CVNNs can be applied in a straightforward manner. Standard boolean functions are implemented to illustrate the operation of LTI ODEVNNs. The paper concludes by highlighting several future research lines, including the need to develop learning algorithms for the newly introduced LTI ODEVNNs.
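The link between LTI ODE weights and complex-valued weights can be illustrated numerically. The sketch below is our own (not code from the paper): driving a first-order LTI ODE weight, y' + a·y = x, with a complex exponential input x(t) = e^{jωt} settles, in steady state, to the input multiplied by the transfer-function value D(jω) = 1/(jω + a). At a fixed frequency the entire ODE therefore acts as a single complex number, which is exactly why an LTI ODEVNN can be read as a CVNN.

```python
import numpy as np

# Illustrative sketch (assumed first-order weight, not the paper's code):
# the LTI ODE weight  y' + a*y = x  driven by x(t) = e^{j w t}  settles to
# D(jw) * e^{j w t}, with transfer function D(s) = 1/(s + a).

a, w = 2.0, 3.0          # pole of the weight ODE, input frequency (rad/s)
dt, T = 1e-4, 20.0       # Euler step and horizon (long enough to settle)
t = np.arange(0.0, T, dt)

y = 0.0 + 0.0j
for tk in t:
    x = np.exp(1j * w * tk)
    y = y + dt * (-a * y + x)   # forward Euler on y' = -a*y + x

# Steady-state gain recovered from simulation vs. the analytic D(jw)
D_sim = y / np.exp(1j * w * t[-1])
D_exact = 1.0 / (a + 1j * w)
print(abs(D_sim - D_exact) < 1e-3)  # True
```

The simulated gain matches D(jω) to within the Euler discretization error, confirming that, at frequency ω, the dynamic weight reduces to one complex multiplication.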


[Figures 1–8 appear in the full article.]


Notes

  1. For instance, see the 𝟙(t) piecewise functions Input 1 and Input 2 in Fig. 8.

  2. The example in Section 5 helps illustrate this codification. In particular, see the LTI ODEVNN outputs XOR, OR, AND in Fig. 5, the zoomed-in output in Fig. 7, and the “binary” interpretation of the 𝟙(t) piecewise functions XOR, OR, AND in Fig. 8, which is explained further in that section.

  3. Dropping the factor \(e^{j\omega_{k}t}\) is equivalent to interpreting \(|D_{i}(j\omega_{k})|\,e^{j(\omega_{k}t+\arg(D_{i}(j\omega_{k})))}\) in a coordinate system that rotates with respect to the complex plane with frequency \(\omega_{k}\).
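The multi-problem codification described in these notes can be mimicked with a toy frequency-multiplexing sketch. The frequency assignments and the decoding rule below are our own assumptions for illustration, not the paper's construction: each boolean answer (e.g. XOR, OR, AND) is placed at its own frequency within one signal, and each bit is read back by correlating against the corresponding complex exponential.

```python
import numpy as np

# Hypothetical illustration (frequencies and threshold are assumptions):
# several boolean outputs share one signal by riding at distinct frequencies;
# the amplitude at each frequency recovers the corresponding bit.

freqs = np.array([1.0, 2.0, 3.0])    # one frequency per problem (e.g. XOR, OR, AND)
bits = np.array([1, 0, 1])           # the three boolean answers to encode

dt = 1e-3
t = np.arange(0.0, 2 * np.pi, dt)    # one common period of all components
signal = (bits[:, None] * np.exp(1j * freqs[:, None] * t)).sum(axis=0)

# Demodulate: correlating with e^{-j w_k t} over a full period isolates
# component k; a 1-bit yields amplitude ~1, a 0-bit ~0.
recovered = [abs(np.mean(signal * np.exp(-1j * w * t))) for w in freqs]
decoded = [1 if r > 0.5 else 0 for r in recovered]
print(decoded)  # [1, 0, 1]
```

Because the component frequencies are distinct integers, the cross-terms average to (nearly) zero over the common period, so each bit is recovered independently from the single shared signal.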


Acknowledgments

The authors would like to thank the editor and the anonymous reviewers for their valuable comments and suggestions to improve the quality of the paper.

Author information

Corresponding author

Correspondence to Cecilio Angulo.

Additional information

This work is partly supported by the Spanish Government (projects TIN2012-38416-C03-01 and DPI2010-18601).


About this article


Cite this article

Velasco, M., Martín, E.X., Angulo, C. et al. LTI ODE-valued neural networks. Appl Intell 41, 594–605 (2014). https://doi.org/10.1007/s10489-014-0548-7
