
Double-Layered Cortical Learning Algorithm for Time-Series Prediction

  • Conference paper
Bio-Inspired Information and Communications Technologies (BICT 2021)

Abstract

This work proposes a double-layered cortical learning algorithm. The cortical learning algorithm is a time-series prediction method inspired by the human neocortex. The human neocortex has a multi-layer structure, whereas the conventional cortical learning algorithm has a single-layer structure. This work introduces a double-layered structure into the cortical learning algorithm. At every time step, the first layer represents the input data and its context. This first-layer representation is transferred to the second layer, where it is encoded as an abstract representation. In turn, the abstract prediction formed in the second layer is fed back to the first layer to modify and enhance the first layer's prediction. Experimental results show that the proposed double-layered cortical learning algorithm achieves higher prediction accuracy than the conventional single-layered cortical learning algorithm and recurrent neural networks with long short-term memory (LSTM) on several artificial time-series datasets.
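The per-time-step flow described in the abstract (first layer represents input plus context, the second layer abstracts that representation, and the abstract prediction is fed back to refine the first layer's prediction) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class names (`Layer`, `DoubleLayerCLA`), the decaying context vectors standing in for sparse distributed representations, and the equal-weight feedback mixing are all assumptions made for illustration.

```python
import numpy as np

class Layer:
    """Toy stand-in for one cortical-learning-algorithm layer.

    A real CLA layer maintains sparse distributed representations with
    temporal memory; here each layer just keeps a decaying context vector
    so the two-layer data flow can be shown end to end.
    """
    def __init__(self, size, decay=0.5):
        self.context = np.zeros(size)
        self.decay = decay

    def represent(self, signal):
        # Mix the new signal into the running context -- a stand-in for
        # representing "the input data and its context" at this time step.
        self.context = self.decay * self.context + (1 - self.decay) * signal
        return self.context

class DoubleLayerCLA:
    def __init__(self, size):
        self.first = Layer(size, decay=0.5)    # concrete input/context layer
        self.second = Layer(size, decay=0.9)   # slower, more abstract layer

    def step(self, x):
        rep1 = self.first.represent(x)          # 1) first layer: input + context
        abstract = self.second.represent(rep1)  # 2) transferred up and abstracted
        # 3) abstract prediction fed back to modify the first-layer prediction
        #    (the 0.5/0.5 mixing weight is an illustrative assumption)
        return 0.5 * rep1 + 0.5 * abstract

model = DoubleLayerCLA(size=3)
out = model.step(np.array([1.0, 0.0, 0.0]))
```

The key structural point the sketch captures is the bidirectional coupling: information flows upward for abstraction and downward as a corrective signal, rather than the purely single-layer processing of the conventional algorithm.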



Acknowledgments

This work was supported by JSPS KAKENHI Grant Number 20J14240.

Author information

Correspondence to Takeru Aoki.


Copyright information

© 2021 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering

About this paper


Cite this paper

Aoki, T., Takadama, K., Sato, H. (2021). Double-Layered Cortical Learning Algorithm for Time-Series Prediction. In: Nakano, T. (eds) Bio-Inspired Information and Communications Technologies. BICT 2021. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 403. Springer, Cham. https://doi.org/10.1007/978-3-030-92163-7_4

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-92163-7_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-92162-0

  • Online ISBN: 978-3-030-92163-7

  • eBook Packages: Computer Science, Computer Science (R0)
