DOI: 10.1145/3583133.3590619

Explaining Memristive Reservoir Computing Through Evolving Feature Attribution

Published: 24 July 2023

Abstract

Memristive Reservoir Computing (MRC) is a promising computing architecture for time-series tasks, but its lack of explainability makes its predictions difficult to trust. To address this issue, we propose an evolutionary framework that explains the time-series predictions of MRC systems by attributing importance to the input features through an evolutionary search. Our experiments show that the approach identifies the most influential factors and provides better explanations than state-of-the-art methods, demonstrating the effectiveness of our design.
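
To make the idea concrete, the sketch below shows one way an evolutionary search can attribute per-timestep importance for a black-box time-series predictor: candidate importance masks are evolved so that keeping only the highly weighted timesteps (and replacing the rest with a baseline signal) preserves the model's prediction while the mask stays sparse. This is an illustrative sketch of evolutionary feature attribution, not the authors' exact algorithm; the names (`evolve_attribution`, `predict`), the (mu+lambda)-style loop, and the `sparsity_weight` trade-off are assumptions introduced for illustration.

```python
import numpy as np

def evolve_attribution(predict, x, baseline=None, pop_size=32,
                       generations=200, sparsity_weight=0.1, seed=0):
    """Evolve a per-timestep importance mask for a black-box predictor.

    `predict` maps an input window of shape (T,) to a scalar prediction;
    `x` is the window to explain.  Hypothetical sketch, not the paper's
    exact algorithm.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    T = x.shape[0]
    if baseline is None:
        baseline = np.full(T, x.mean())      # neutral reference signal
    y_ref = predict(x)

    def fitness(mask):
        # A good mask keeps the prediction close to the original while
        # marking as few timesteps as possible as important.
        perturbed = mask * x + (1.0 - mask) * baseline
        faithfulness = -abs(predict(perturbed) - y_ref)
        sparsity_penalty = sparsity_weight * mask.sum() / T
        return faithfulness - sparsity_penalty

    pop = rng.random((pop_size, T))          # masks with entries in [0, 1]
    for _ in range(generations):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[-(pop_size // 2):]]        # truncation selection
        children = parents + rng.normal(0.0, 0.05, parents.shape)   # Gaussian mutation
        pop = np.clip(np.vstack([parents, children]), 0.0, 1.0)

    scores = np.array([fitness(m) for m in pop])
    return pop[np.argmax(scores)]            # per-timestep importance in [0, 1]


# Example with a toy predictor: the "model" only looks at the last 5 steps,
# so a faithful mask should concentrate its weight there.
if __name__ == "__main__":
    toy_predict = lambda w: float(w[-5:].sum())
    window = np.sin(np.linspace(0.0, 6.0, 50))
    importance = evolve_attribution(toy_predict, window)
    print(np.round(importance, 2))
```

The returned mask can be read as a saliency curve over the input window; in an MRC setting, `predict` would wrap the trained reservoir and its readout.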


Cited By

  • (2024) Accelerating surrogate assisted evolutionary algorithms for expensive multi-objective optimization via explainable machine learning. Swarm and Evolutionary Computation 88, 101610. DOI: 10.1016/j.swevo.2024.101610. Online publication date: Jul-2024.

Published In

GECCO '23 Companion: Proceedings of the Companion Conference on Genetic and Evolutionary Computation
July 2023
2519 pages
ISBN:9798400701207
DOI:10.1145/3583133

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. explainability
  2. evolutionary algorithm
  3. reservoir computing
  4. memristor

Qualifiers

  • Poster

Conference

GECCO '23 Companion

Acceptance Rates

Overall Acceptance Rate 1,669 of 4,410 submissions, 38%

