Abstract
This work proposes a double-layered cortical learning algorithm. The cortical learning algorithm is a time-series prediction method inspired by the human neocortex. The neocortex has a multi-layer structure, whereas the conventional cortical learning algorithm has a single-layer structure. This work introduces a double-layered structure into the cortical learning algorithm. The first layer represents the input data and its context at every time step. The contextual representation formed in the first layer is transferred to the second layer, where it is re-encoded as an abstract representation. In turn, the abstract prediction made in the second layer is fed back to the first layer to refine the first layer's prediction. Experimental results show that the proposed double-layered cortical learning algorithm achieves higher prediction accuracy than the conventional single-layered cortical learning algorithm and a recurrent neural network with long short-term memory on several artificial time-series datasets.
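The abstract describes a two-layer flow: the first layer encodes the input and its context, the second layer forms an abstract representation and prediction, and that prediction is fed back to refine the first layer's prediction. The following is a minimal conceptual sketch of that flow in Python; the Layer class and the represent/predict/double_layered_step names are hypothetical stand-ins for illustration, not the authors' implementation.

# Conceptual sketch of the double-layered flow described in the abstract.
# All names here (Layer, represent, predict, double_layered_step) are
# hypothetical placeholders, not the paper's actual components.

class Layer:
    """Placeholder for a single cortical-learning-algorithm layer."""

    def __init__(self, name):
        self.name = name

    def represent(self, signal):
        # Form a representation of the signal together with its temporal context.
        return {"layer": self.name, "context": signal}

    def predict(self, representation):
        # Produce a prediction for the next time step from the representation.
        return {"layer": self.name, "prediction": representation}


def double_layered_step(x_t, layer1, layer2):
    """One time step of the sketched double-layered scheme."""
    context1 = layer1.represent(x_t)          # layer 1: input and its context
    abstract = layer2.represent(context1)     # layer 2: abstract representation
    abstract_pred = layer2.predict(abstract)  # layer 2: abstract prediction
    # Feedback: layer 2's abstract prediction refines layer 1's own prediction.
    refined = dict(layer1.predict(context1), feedback=abstract_pred)
    return refined

In this sketch, the return value of double_layered_step stands for the first-layer prediction after it has been modified by the second layer's feedback, mirroring the description in the abstract.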
Acknowledgments
This work was supported by JSPS KAKENHI Grant Number 20J14240.
Copyright information
© 2021 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering
About this paper
Cite this paper
Aoki, T., Takadama, K., Sato, H. (2021). Double-Layered Cortical Learning Algorithm for Time-Series Prediction. In: Nakano, T. (eds) Bio-Inspired Information and Communications Technologies. BICT 2021. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 403. Springer, Cham. https://doi.org/10.1007/978-3-030-92163-7_4
DOI: https://doi.org/10.1007/978-3-030-92163-7_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-92162-0
Online ISBN: 978-3-030-92163-7
eBook Packages: Computer Science, Computer Science (R0)