Abstract:
The human brain effectively integrates prior knowledge into new skills by transferring experience across tasks without suffering from catastrophic forgetting. In this study, to continuously learn a sequence of visual classification tasks (PermutedMNIST), we employed a neural network model with lateral connections, sparse group Least Absolute Shrinkage and Selection Operator (LASSO) regularization, and projection regularization to decrease feature redundancy. We show that encouraging feature novelty in progressive neural networks (PNNs) prevents a major performance decrease under sparsification, and that sparsifying a progressive neural network yields fair results while decreasing the number of task-specific parameters learned on novel tasks.
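The sparse group LASSO penalty mentioned above combines element-wise L1 sparsity with a group-wise L2 norm that can drive entire groups of weights to zero. The following is a minimal numpy sketch of that penalty, not the authors' implementation; the choice of rows as groups (one unit's incoming weights) and the `lam1`/`lam2` coefficients are illustrative assumptions.

```python
import numpy as np

def sparse_group_lasso_penalty(weights, lam1=1e-3, lam2=1e-3):
    """Sparse group LASSO penalty on a 2-D weight matrix.

    lam1 scales an element-wise L1 term (sparsity within groups);
    lam2 scales a group-wise L2 term, weighted by sqrt(group size),
    which can zero out whole groups (here: whole units).
    """
    l1 = lam1 * np.abs(weights).sum()
    # Treat each row (one unit's incoming weights) as a group.
    group_size = weights.shape[1]
    group = lam2 * np.sqrt(group_size) * np.linalg.norm(weights, axis=1).sum()
    return l1 + group

# Example: the second unit's weights are exactly zero, so only
# the first row contributes to both terms.
W = np.array([[0.5, -0.5],
              [0.0,  0.0]])
penalty = sparse_group_lasso_penalty(W, lam1=0.1, lam2=0.1)
```

In training, such a penalty would be added to the task loss so that unused units in the sparsified network can be pruned entirely.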
Date of Conference: 05-07 October 2020
Date Added to IEEE Xplore: 07 January 2021
Print on Demand(PoD) ISSN: 2165-0608