Adaptive spiking neuron with population coding for a residual spiking neural network


Abstract

Spiking neural networks (SNNs) have attracted significant research attention due to their inherent sparsity and event-driven processing capabilities. Recent studies indicate that incorporating convolutional and residual structures into SNNs can substantially enhance performance. However, such converted spiking residual structures bring increased complexity and stacks of parameterized spiking neurons. To address this challenge, this paper proposes a carefully refined two-layer decision structure for residual-based SNNs, consisting solely of fully connected and spiking neuron layers. Specifically, the spiking neuron layers adopt a novel dynamic leaky integrate-and-fire (DLIF) neuron model with a nonlinear self-feedback mechanism, characterized by dynamic threshold adjustment and a self-regulating firing rate. Furthermore, diverging from traditional direct encoding, which considers only the firing frequency of individual neurons, we introduce a mixed coding mechanism that combines direct encoding with multineuronal population decoding. The proposed architecture improves the adaptability and responsiveness of spiking neurons in various computational contexts. Experimental results demonstrate the efficacy of our approach: despite its highly simplified structure and only 6 timesteps, it outperforms multiple state-of-the-art methods, achieving accuracy improvements of 0.01-1.99% on three static datasets and 0.14-7.50% on three neuromorphic (N-) datasets. The DLIF model also excels at information processing, exhibiting double the mutual information of other neuron models, and on the sequential MNIST task it balances biological realism with practicality, enhancing memory capacity and dynamic range. Our proposed method not only offers improved computational efficiency and a simplified network structure but also enhances the biological plausibility of SNN models and can be easily adapted to other deep SNNs.
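As a concrete illustration of the two mechanisms the abstract describes, the following is a minimal PyTorch-style sketch of a DLIF-like neuron and multineuronal population decoding. It is hedged throughout: the sigmoid self-feedback term, the additive threshold-adaptation rule, the parameters tau, v_th0, alpha, and beta, and the helper population_decode are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn


class DLIFNeuron(nn.Module):
    """Illustrative dynamic leaky integrate-and-fire (DLIF) neuron.

    The leak constant, sigmoid self-feedback, and additive threshold
    adaptation below are assumptions for exposition only.
    """

    def __init__(self, tau=2.0, v_th0=1.0, alpha=0.2, beta=0.1):
        super().__init__()
        self.tau = tau      # membrane leak constant (assumed)
        self.v_th0 = v_th0  # baseline firing threshold (assumed)
        self.alpha = alpha  # strength of nonlinear self-feedback (assumed)
        self.beta = beta    # speed of dynamic threshold adaptation (assumed)

    def forward(self, x):
        # x: (T, batch, features) input currents over T timesteps
        v = torch.zeros_like(x[0])                # membrane potential
        v_th = torch.full_like(x[0], self.v_th0)  # per-neuron dynamic threshold
        s = torch.zeros_like(x[0])                # spikes from the previous step
        out = []
        for t in range(x.shape[0]):
            # Leaky integration plus nonlinear self-feedback: recent spiking
            # damps the potential, which self-regulates the firing rate.
            v = v + (x[t] - v) / self.tau - self.alpha * torch.sigmoid(v) * s
            s = (v >= v_th).float()  # fire when the dynamic threshold is crossed
            v = v * (1.0 - s)        # hard reset after a spike
            # Dynamic threshold: rises after a spike, relaxes toward baseline.
            v_th = v_th + self.beta * (s - (v_th - self.v_th0))
            out.append(s)
        return torch.stack(out)      # (T, batch, features) spike train


def population_decode(spikes, num_classes, neurons_per_class):
    """Average each class's neuron group over time to get class scores."""
    rates = spikes.mean(dim=0)  # per-neuron firing rate over T steps
    return rates.view(-1, num_classes, neurons_per_class).mean(dim=-1)


# Two-layer decision head in the spirit of the abstract: one fully connected
# layer feeding DLIF neurons, read out by a multineuron population per class.
num_classes, neurons_per_class, T = 10, 8, 6
head = nn.Linear(128, num_classes * neurons_per_class)
dlif = DLIFNeuron()

x = torch.randn(32, 128)                # a batch of upstream feature vectors
inp = x.unsqueeze(0).expand(T, -1, -1)  # direct encoding: same input each step
scores = population_decode(dlif(head(inp)), num_classes, neurons_per_class)
print(scores.shape)                     # torch.Size([32, 10])
```

Allocating num_classes x neurons_per_class output neurons and averaging each group's firing rate over the 6 timesteps is one common way to realize population decoding; direct encoding is sketched here by presenting the same analog input as current at every timestep.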




Data availability and access

The data used to support the findings of this study are available from the corresponding author upon request.


Acknowledgements

This research is supported by the Henan Provincial Science and Technology Research Project (242102210006).

Author information


Contributions

Yongping Dan: Conceptualization; Funding acquisition; Project administration; Resources; Supervision; Visualization; Writing - original draft. Changhao Sun: Conceptualization; Formal analysis; Investigation; Methodology; Software; Visualization; Writing - original draft. Hengyi Li: Formal analysis; Validation; Visualization; Writing - review and editing. Lin Meng: Formal analysis; Validation; Visualization; Writing - review and editing.

Corresponding author

Correspondence to Yongping Dan.

Ethics declarations

Competing Interests

All authors declare that they have no competing financial interests or personal relationships that could influence the work reported in this paper.

Ethical and informed consent for the data used

This work does not include studies with human participants or animals. The statement of informed consent is not applicable because the manuscript does not contain any patient data.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Dan, Y., Sun, C., Li, H. et al. Adaptive spiking neuron with population coding for a residual spiking neural network. Appl Intell 55, 288 (2025). https://doi.org/10.1007/s10489-024-06128-z


Keywords