
Dispersion Constraint Based Non-negative Sparse Coding Model

Published in: Neural Processing Letters

Abstract

Building on the advantages of the basic non-negative sparse coding (NNSC) model, and taking into account the prior class constraint of image features, a novel NNSC model is discussed here. In this model, the sparseness criterion is a two-parameter density estimation model, and the dispersion ratio of within-class to between-class scatter is used as the class constraint. With this NNSC model, image features can be extracted successfully, and feature recognition with different classifiers can then be carried out effectively. Simulation results show that the proposed NNSC model is indeed effective for image feature extraction and recognition.
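The abstract gives no algorithmic details, so the following Python sketch is only an illustration of the general idea under stated assumptions: it minimizes a squared reconstruction error plus a log-cosh sparseness term, adds a simple within-class scatter penalty as a stand-in for the full within/between-class dispersion ratio, and enforces non-negativity by projected gradient descent. All function names, the simplified penalty, and the optimizer are assumptions for illustration, not the authors' method (which uses a two-parameter density estimation model for the sparseness term).

```python
import numpy as np

def dispersion_ratio(H, labels):
    """Within-class to between-class dispersion of the feature columns of H.

    `labels` is assumed to be a NumPy integer array, one label per column of H.
    """
    mu = H.mean(axis=1, keepdims=True)
    sw = 0.0
    sb = 0.0
    for c in np.unique(labels):
        Hc = H[:, labels == c]
        mc = Hc.mean(axis=1, keepdims=True)
        sw += np.sum((Hc - mc) ** 2)                 # within-class scatter
        sb += Hc.shape[1] * np.sum((mc - mu) ** 2)   # between-class scatter
    return sw / (sb + 1e-12)

def within_class_grad(H, labels):
    """Gradient of the within-class scatter w.r.t. H (ignoring a small 1/n_c term)."""
    G = np.zeros_like(H)
    for c in np.unique(labels):
        idx = labels == c
        mc = H[:, idx].mean(axis=1, keepdims=True)
        G[:, idx] = 2.0 * (H[:, idx] - mc)
    return G

def nnsc_fit(X, labels, k=8, lam=0.1, gamma=0.01, lr=1e-3, iters=200, seed=0):
    """Toy NNSC: squared reconstruction error + log-cosh sparseness on H
    + a within-class scatter penalty standing in for the dispersion constraint.
    Non-negativity of W and H is enforced by projecting after each step."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k))
    H = rng.random((k, n))
    for _ in range(iters):
        R = W @ H - X                                # reconstruction residual
        gW = R @ H.T
        gH = W.T @ R + lam * np.tanh(H) + gamma * within_class_grad(H, labels)
        W = np.maximum(W - lr * gW, 0.0)             # projection keeps W >= 0
        H = np.maximum(H - lr * gH, 0.0)             # projection keeps H >= 0
    return W, H
```

Projected gradient is chosen here only for brevity; the NNSC literature (e.g. Hoyer's formulation) more commonly uses multiplicative update rules, which preserve non-negativity without an explicit projection step.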


Fig. 1
Fig. 2

References

  1. Hoyer PO (2003) Modelling receptive fields with non-negative sparse coding. Neurocomputing 52:547–552

    Article  Google Scholar 

  2. Olshausen BA, Field DJ (1996) Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature 381:607–609

    Article  Google Scholar 

  3. Li L, Zhang YJ (2009) SENSC: a stable and efficient algorithm for nonnegative sparse coding. Acta Autom Sin 35:439–443

    MathSciNet  Google Scholar 

  4. Lee DD, Seng HS (1999) Learning the parts of objects by non-negative matrix factorization. Nature 401:788–891

    Article  Google Scholar 

  5. Cao J, Lin Z (2014) Bayesian signal detection with compressed measurements. Inform Sci 289:241–253

    Article  Google Scholar 

  6. Shang Li (2008) Non-negative sparse coding shrinkage for image denoising using normal inverse Gaussian density model. Image Vis Comput 26:1137–1147

    Article  Google Scholar 

  7. Hyvärinen A (1997) Sparse coding shrinkage: denoising of nongaussian data by maximum likelihood estimation. Neural Comput 11:1739–1768

    Article  Google Scholar 

  8. Cao J, Chen T, Fan J (2014) Fast online learning algorithm for landmark recognition based on BoW framework. In: Proceedings of the 9th IEEE Conference on Industrial Electronics and Applications. Hangzhou, China, June 2014, pp 1163–1168

  9. Huang GB, Zhu Q, Siew CK (2006) Extreme learning machine: theory and applications. Neurocomputing 70:489–501

    Article  Google Scholar 

  10. Cao J, Xiong L (2014) Protein sequence classification with improved extreme learning machine algorithms, BioMed Research International, vol. 2014, Article ID 103054, 12 pages. doi:10.1155/2014/103054

  11. Huang GB, Chen L (2007) Convex incremental extreme learning machine. Neurocomputing 70:3056–3062

    Article  Google Scholar 

  12. Huang GB, Chen L (2008) Enhanced random search based incremental extreme learning machine. Neurocomputing 71:3460–3468

    Article  Google Scholar 

  13. Cortes C, Vapnik VN (1995) Support vector networks. Mach Learn 20:273–297

    MATH  Google Scholar 

  14. Chen GY, Xie WF (2006) Pattern recognition with SVM and dual-tree complex wavelets. Image Vis Comput 25:960–966

    Article  Google Scholar 

  15. Zhang L, Zhou W, Jiao L (2004) Wavelet support vector machine. IEEE Trans Syst, Man, Cybern—Part B 34:34–39

    Article  Google Scholar 


Acknowledgments

This work was supported by the National Natural Science Foundation of China (Nos. 61373098, 61370109).

Author information


Corresponding author

Correspondence to Li Shang.


About this article


Cite this article

Wang, X., Wang, C., Shang, L. et al. Dispersion Constraint Based Non-negative Sparse Coding Model. Neural Process Lett 43, 603–609 (2016). https://doi.org/10.1007/s11063-015-9432-7

