Abstract:
It has been demonstrated that the computational capabilities of echo state networks are maximized when the recurrent layer is close to the border between stable and unstable dynamical regimes, the so-called edge of stability, or criticality. This maximization of performance is computationally useful, leading to minimal prediction error or maximal memory capacity, and has been shown to coincide with the maximization of information-theoretic measures, such as transfer entropy and active information storage, for some datasets. In this paper, we take a closer look at these measures, using the Kraskov-Stögbauer-Grassberger estimator with optimized parameters. We experiment with four datasets of differing complexity and discover interesting differences compared to previous work, such as more complex behavior of the information-theoretic measures. We also investigate the effect of reservoir orthogonalization, which has been shown earlier to maximize memory capacity, on the prediction accuracy and the above-mentioned measures.
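The two reservoir constructions the abstract refers to can be illustrated with a short sketch (not the paper's actual implementation; matrix size and the 0.95 scaling factor are illustrative assumptions). A random reservoir is rescaled so its spectral radius sits just below 1, the edge of stability; an orthogonalized reservoir, obtained here via QR decomposition, has all its eigenvalues on the unit circle by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # reservoir size (illustrative choice)

# Random recurrent weight matrix, rescaled so its spectral radius is
# just below 1 -- the "edge of stability" regime discussed in the paper.
W = rng.standard_normal((n, n))
rho = max(abs(np.linalg.eigvals(W)))
W_edge = W * (0.95 / rho)

# Orthogonalized reservoir: QR decomposition of W yields an orthogonal Q,
# whose eigenvalues all have magnitude 1 (Q @ Q.T is the identity).
Q, _ = np.linalg.qr(W)

print(max(abs(np.linalg.eigvals(W_edge))))  # spectral radius of the scaled reservoir
print(np.allclose(Q @ Q.T, np.eye(n)))      # orthogonality check
```

The orthogonal reservoir places the network exactly at the stability border, which is one intuition behind the memory-capacity result the abstract mentions.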
Date of Conference: 08-13 July 2018
Date Added to IEEE Xplore: 14 October 2018
Electronic ISSN: 2161-4407