Abstract:
Neuroplasticity, the adaptive capacity of the nervous system, spans a broad spectrum from short-term synaptic adjustments to the establishment of new neuronal connections. While artificial neural networks (ANNs) draw inspiration from the human brain, their current implementations focus primarily on synaptic weight plasticity, most notably through the backpropagation algorithm (BP). Despite its machine learning prowess, BP deviates from biological realism because of the constraints it imposes, and it does not match the brain's superior generalization capabilities. To bridge this gap, previous work has proposed a local competitive learning rule aligned with Hebbian plasticity, yet that rule falls short with respect to sparsity and adherence to Dale's law. This work addresses these shortcomings by integrating the competitive learning rule with additional mechanisms. A nonnegativity constraint is introduced to align with Dale's law and to induce sparsity. Weight Perturbation (WP) is employed as a biologically plausible surrogate for the gradient. Homeostatic plasticity counters the long-term potentiation (LTP) induced by Hebbian plasticity. The resulting framework offers a biologically plausible model that maintains robust classification performance despite the constraint of a purely excitatory network. The learning rule also demonstrates strong sparsification, even pruning entire hidden neurons, which may prevent overparametrization and thereby foster better generalization.
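The abstract names three mechanisms layered on top of the competitive Hebbian rule: a nonnegativity constraint, Weight Perturbation as a gradient surrogate, and homeostatic plasticity. The paper's equations are not reproduced here, so the following is a minimal illustrative sketch of how those three mechanisms could compose in a single update step; the loss function, noise scale sigma, learning rate lr, and target_norm are all assumptions for illustration, and the competitive Hebbian rule itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(W, x, y):
    # Hypothetical task loss: squared error of a ReLU layer (illustration only).
    h = np.maximum(W @ x, 0.0)
    return np.sum((h - y) ** 2)

def update(W, x, y, sigma=1e-3, lr=1e-2, target_norm=1.0):
    # 1) Weight Perturbation: perturb the weights with Gaussian noise and use
    #    the induced change in loss as a biologically plausible gradient surrogate.
    xi = sigma * rng.standard_normal(W.shape)
    delta = loss(W + xi, x, y) - loss(W, x, y)
    W = W - lr * (delta / sigma**2) * xi
    # 2) Nonnegativity constraint (purely excitatory weights, per Dale's law):
    #    clamping at zero also produces exact zeros, i.e. sparsity.
    W = np.maximum(W, 0.0)
    # 3) Homeostatic scaling: rescale any neuron whose incoming-weight norm
    #    exceeds a target, so potentiation cannot grow without bound.
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W * (target_norm / np.maximum(norms, target_norm))

# Toy usage: 4 hidden neurons, 8 inputs, one fixed training example.
W = 0.1 * np.abs(rng.standard_normal((4, 8)))
x, y = rng.standard_normal(8), np.abs(rng.standard_normal(4))
for _ in range(200):
    W = update(W, x, y)
```

In this sketch the zero clamp, not the WP step, is what yields exact zeros (and thus whole pruned neurons) rather than merely small weights, while the homeostatic rescaling bounds each neuron's total synaptic strength.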
Date of Conference: 30 June 2024 - 05 July 2024
Date Added to IEEE Xplore: 09 September 2024