Abstract
In this paper, we propose an overcomplete nonnegative dictionary learning method for the sparse representation of signals, posed as a problem of nonnegative matrix factorization (NMF) with a sparsity constraint. By introducing the sparsity constraint, we show that the problem can be cast as two sequential optimization problems over parabolic functions, although the forms of these parabolic functions differ from those of the unconstrained case [1,2]. The problems can therefore be solved efficiently by generalizing the hierarchical alternating least squares (HALS) algorithm, since the original HALS works only for the unconstrained case. The dictionary learning process converges quickly and its computational cost is low. Numerical experiments show that the algorithm outperforms the nonnegative K-SVD (NN-KSVD) and two other compared algorithms, while the computational cost is remarkably reduced as well.
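The HALS scheme referenced in the abstract updates one factor column (or row) at a time under a nonnegativity constraint, and an L1 penalty on the coefficient matrix is one common way to impose sparsity. The sketch below is illustrative only: it does not reproduce the paper's exact sparsity-constrained updates, and the penalty weight `lam` and the unit-norm normalization of dictionary atoms are assumed conventions, not taken from the source.

```python
import numpy as np

def hals_nmf_sparse(X, r, lam=0.1, n_iter=200, seed=0):
    """Minimal HALS-style NMF sketch with an L1 sparsity penalty on H.

    Factorizes nonnegative X (m x n) as W (m x r) times H (r x n).
    Illustrative only; `lam` and the normalization step are assumptions.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    eps = 1e-12
    for _ in range(n_iter):
        # Update each row of H in turn, with L1 shrinkage (-lam) for sparsity.
        WtX = W.T @ X
        WtW = W.T @ W
        for j in range(r):
            num = WtX[j] - WtW[j] @ H + WtW[j, j] * H[j] - lam
            H[j] = np.maximum(num / (WtW[j, j] + eps), 0.0)
        # Update each column of W (plain HALS; no penalty on the dictionary).
        XHt = X @ H.T
        HHt = H @ H.T
        for j in range(r):
            num = XHt[:, j] - W @ HHt[:, j] + HHt[j, j] * W[:, j]
            W[:, j] = np.maximum(num / (HHt[j, j] + eps), 0.0)
        # Rescale so dictionary atoms have unit norm (a common convention).
        norms = np.linalg.norm(W, axis=0) + eps
        W /= norms
        H *= norms[:, None]
    return W, H
```

Each inner update solves a one-dimensional nonnegative least-squares problem in closed form (the "parabolic function" viewpoint), which is what makes HALS-type iterations cheap per sweep.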
References
Cichocki, A., Zdunek, R., Amari, S.-I.: Hierarchical ALS Algorithms for Nonnegative Matrix and 3D Tensor Factorization. In: Davies, M.E., James, C.J., Abdallah, S.A., Plumbley, M.D. (eds.) ICA 2007. LNCS, vol. 4666, pp. 169–176. Springer, Heidelberg (2007)
Cichocki, A., Phan, A.-H.: Fast local algorithms for large scale nonnegative matrix and tensor factorizations. IEICE Trans. on Fundamentals of Electronics E92-A(3), 708–721 (2009)
Tošić, I., Frossard, P.: Dictionary learning. IEEE Signal Processing Magazine 28(2), 27–38 (2011)
Olshausen, B.A., Field, D.J.: Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature 381, 607–609 (1996)
Aharon, M., Elad, M., Bruckstein, A.: K-SVD and its nonnegative variant for dictionary design. In: Proceedings of the SPIE Conference Wavelets, vol. 5914, pp. 327–339 (July 2005)
Hoyer, P.O.: Non-negative matrix factorization with sparseness constraints. Journal of Machine Learning Research 5, 1457–1469 (2004)
Lee, D.D., Seung, H.S.: Learning the parts of objects by nonnegative matrix factorization. Nature 401, 788–791 (1999)
Lee, D.D., Seung, H.S.: Algorithms for non-negative matrix factorization. In: Advances in Neural Information Processing Systems, pp. 556–562 (2001)
Peharz, R., Stark, M., Pernkopf, F.: Sparse nonnegative matrix factorization using ℓ0-constraints. In: 2010 IEEE International Workshop on Machine Learning for Signal Processing (MLSP), pp. 83–88 (September 2010)
Gillis, N.: Nonnegative Matrix Factorization: Complexity, Algorithms and Applications. PhD thesis, Université catholique de Louvain (2011)
Berry, M.W., Browne, M., Langville, A.N., Pauca, V.P., Plemmons, R.J.: Algorithms and applications for approximate nonnegative matrix factorization. Computational Statistics & Data Analysis 52(1), 155–173 (2007)
Aharon, M., Elad, M., Bruckstein, A.: K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation. IEEE Trans. on Signal Processing 54(11), 4311–4322 (2006)
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Tang, Z., Ding, S. (2012). Nonnegative Dictionary Learning by Nonnegative Matrix Factorization with a Sparsity Constraint. In: Wang, J., Yen, G.G., Polycarpou, M.M. (eds) Advances in Neural Networks – ISNN 2012. ISNN 2012. Lecture Notes in Computer Science, vol 7368. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31362-2_11
DOI: https://doi.org/10.1007/978-3-642-31362-2_11
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-31361-5
Online ISBN: 978-3-642-31362-2
eBook Packages: Computer Science (R0)