Impact Statement:
Sparse representation is an essential branch of machine learning. Sparsity has been explored as a way to address the limited computational resources available in practical deep learning applications. Unlike previous reviews, our survey comprehensively covers sparse cognition, biology-inspired mechanisms and modeling, and traditional sparse learning methods. In addition, sparse deep networks and their related applications are thoroughly investigated, which offers useful insight for deep learning practitioners. Finally, ten open issues and challenges of sparse deep learning are presented, which will significantly influence the further development of the next generation of sparse learning.
Abstract:
In recent years, the enormous demand for computing resources arising from massive data and complex network models has become a major bottleneck for deep learning. For large-scale problems with massive samples and ultrahigh feature dimensions, sparsity has gradually drawn much attention from both academia and industry. In this article, the new generation of brain-inspired sparse learning is reviewed comprehensively. First, sparse cognition learning is introduced, from the biological mechanisms of vision to the modeling of natural images. Second, sparse representation algorithms are summarized to trace the research progress of sparse learning. Third, the relevant research on sparse feature selection learning is reviewed. Then, sparse deep networks and their applications are summarized. Last but not least, ten open issues and challenges of sparse learning are discussed. By investigating the development of sparse learning, this article summarizes the advantages, disadvantages, limitations, and future research directions of these algorithms, which can help readers conduct further study.
Published in: IEEE Transactions on Artificial Intelligence (Volume: 3, Issue: 6, December 2022)