Normal sparse Deep Belief Network


Abstract:

Nowadays it is very popular to use deep architectures in machine learning. Deep Belief Networks (DBNs) have deep architectures that create a powerful generative model from training data. DBNs can be used for classification and feature learning. A DBN can be trained unsupervised, and the learnt features are then suitable for a simple classifier (such as a linear classifier) with only a few labeled data. According to previous research, the training of DBNs can be improved to produce features with more interpretability and discrimination ability. One such improvement is sparsity in the features learnt by the DBN. By using sparsity, we can learn useful low-level feature representations for unlabeled data. A sparse representation has the advantage that the learnt features can be interpreted, i.e. they correspond to meaningful aspects of the input and capture factors of variation in the data. Different methods have been proposed to build sparse RBMs. In this paper we propose a new method, called nsDBN, that behaves differently according to the deviation of the activation of the hidden units from a (low) fixed value. Our proposed method also has a variance parameter that controls how strongly sparseness is enforced. According to the results, compared to state-of-the-art methods including PCA, RBM, qsRBM, and rdsRBM, our new method always achieves the best recognition accuracy on the MNIST handwritten digit recognition test set, even when only 10 to 20 labeled samples per class are used as training data.
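To make the idea concrete, below is a minimal sketch of a sparsity penalty of the kind the abstract describes: the mean activation of each hidden unit is compared to a low fixed target, and the penalty responds to the deviation from that target through a normal (Gaussian) shape whose variance controls how strongly sparseness is enforced. The function name, the specific Gaussian form, and all parameter values are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def normal_sparsity_penalty(hidden_probs, target=0.05, sigma=0.2, lam=1.0):
    """Illustrative 'normal'-shaped sparsity penalty for RBM hidden units.

    hidden_probs : (batch, n_hidden) array of hidden activation probabilities.
    target       : low fixed value the mean activation is pushed toward.
    sigma        : variance parameter; smaller sigma enforces sparseness
                   more strongly for the same deviation.
    lam          : overall penalty weight.
    """
    # Mean activation of each hidden unit over the batch.
    q = hidden_probs.mean(axis=0)
    dev = q - target
    # Gaussian-shaped term: zero penalty when q equals the target,
    # growing toward lam per unit as the deviation increases.
    penalty = lam * np.sum(1.0 - np.exp(-dev**2 / (2.0 * sigma**2)))
    # Gradient w.r.t. q, which would enter the RBM weight update
    # alongside the usual contrastive-divergence gradient.
    grad_q = lam * (dev / sigma**2) * np.exp(-dev**2 / (2.0 * sigma**2))
    return penalty, grad_q
```

Note the behavior this shape gives: near the target the penalty (and its gradient) is small, while units whose average activation drifts away from the low target are penalized, which is one way to realize "different behaviors according to the deviation of the activation" with a tunable variance.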
Date of Conference: 12-17 July 2015
Date Added to IEEE Xplore: 01 October 2015
Conference Location: Killarney, Ireland
